Java – missing application resources
I wrote a Spark program in Java. The code is as follows:
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

public class SimpleApp {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("wordCount").setMaster("local");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<String> input = sc.textFile("/bigdata/softwares/spark-2.1.0-bin-hadoop2.7/testdata/a.txt");

        // Count the lines containing each keyword.
        Long bCount = input.filter(new Function<String, Boolean>() {
            public Boolean call(String s) { return s.contains("yes"); }
        }).count();
        Long cCount = input.filter(new Function<String, Boolean>() {
            public Boolean call(String s) { return s.contains("ywq"); }
        }).count();
        long total = input.count();  // total number of lines

        System.out.println("yes:" + bCount + " ywq:" + cCount + " all:" + total);
        sc.stop();  // release the SparkContext when done
    }
}
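For reference, the filter-and-count logic above can be sketched without Spark, using plain Java streams on an in-memory list of lines (an illustrative sketch only; the class and method names here are hypothetical, not part of the original program):

```java
import java.util.Arrays;
import java.util.List;

public class WordFilterDemo {
    // Count lines containing a keyword, mirroring the RDD filter + count pattern.
    static long countContaining(List<String> lines, String keyword) {
        return lines.stream().filter(s -> s.contains(keyword)).count();
    }

    public static void main(String[] args) {
        List<String> lines = Arrays.asList("yes it works", "ywq test", "yes and ywq", "other");
        long bCount = countContaining(lines, "yes");  // 2
        long cCount = countContaining(lines, "ywq");  // 2
        System.out.println("yes:" + bCount + " ywq:" + cCount + " all:" + lines.size());
    }
}
```

The Spark version does the same thing, except the "list" is a distributed RDD read from a file.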
The POM is as follows:
<dependencies>
  <dependency> <!-- Spark dependency -->
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
  </dependency>
</dependencies>
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.3</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
Maven packaged everything into the jar, but the following error was reported at runtime (the error screenshot is not available here). I have just started learning Spark; does anyone know what is wrong? Thank you.
Solution
You must also specify the main class when using spark-submit:
spark-submit --class <your.package>.SimpleApp testjar/spark-0.0.1-SNAPSHOT.jar
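Alternatively (this is my addition, not part of the original answer), you can record the main class in the shaded jar's manifest with the shade plugin's ManifestResourceTransformer, so spark-submit can pick it up when `--class` is omitted. The `<mainClass>SimpleApp</mainClass>` value below assumes the class is in the default package, as in the code shown above; adjust it to the fully qualified name if you use a package:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- Writes Main-Class into META-INF/MANIFEST.MF of the shaded jar -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>SimpleApp</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```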