Java – Apache Spark: creating a HiveContext throws NoSuchMethodException
I have the following question. My main method is:
static public void main(String args[]) {
    SparkConf conf = new SparkConf().setAppName("TestHive");
    SparkContext sc = new org.apache.spark.SparkContext(conf);
    HiveContext hiveContext = new org.apache.spark.sql.hive.HiveContext(sc);
}
I built it with mvn package and then submitted my code, but I got the following exception and I don't know what's wrong:
sh spark-submit --class "TestHive" --master local[4] ~/target/test-1.0-SNAPSHOT-jar-with-dependencies.jar
Exception in thread "main" java.lang.NoSuchMethodException: org.apache.hadoop.hive.conf.HiveConf.getTimeVar(org.apache.hadoop.hive.conf.HiveConf$ConfVars, java.util.concurrent.TimeUnit)
Please tell me what I'm doing wrong.
P.S. I built my Spark with Hive and Thrift Server support:
Spark 1.5.2 built for Hadoop 2.4.0
Build flags: -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn
Solution
This looks like a version conflict between the spark-core, spark-sql, and spark-hive components. To avoid it, all of these components should use the same version. You can define a spark.version property in your pom.xml, for example:
<properties>
    <spark.version>1.6.0</spark.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
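As a rough sketch of what the program might look like once the dependency versions are aligned (and ideally set to match the Spark build you submit against, so 1.5.2 here rather than 1.6.0), the original main method can also run a quick query to confirm the HiveContext actually works. The app name and the SHOW TABLES statement below are only illustrative:

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.sql.hive.HiveContext;

public class TestHive {
    public static void main(String[] args) {
        // Configure and start a Spark context; the app name is just an example.
        SparkConf conf = new SparkConf().setAppName("TestHive");
        SparkContext sc = new SparkContext(conf);

        // With matching spark-core/spark-sql/spark-hive versions on the classpath,
        // constructing the HiveContext should no longer throw NoSuchMethodException.
        HiveContext hiveContext = new HiveContext(sc);

        // A simple HiveQL statement to verify the metastore is reachable.
        hiveContext.sql("SHOW TABLES").show();

        sc.stop();
    }
}

After changing the POM, rebuild with mvn package so the jar-with-dependencies picks up the aligned versions, then re-run spark-submit.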