Here is the error after building with Scala 2.10:

"
Spark Command: /usr/lib/jvm/java-1.8.0/bin/java -cp /home/raymond.honderdors/Documents/IdeaProjects/spark/conf/:/home/raymond.honderdors/Documents/IdeaProjects/spark/assembly/target/scala-2.10/jars/* -Xms5g -Xmx5g org.apache.spark.deploy.SparkSubmit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 spark-internal
========================================
Exception in thread "main" java.lang.IncompatibleClassChangeError: Implementing class
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
        at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
        at java.lang.Class.getMethod0(Class.java:3018)
        at java.lang.Class.getMethod(Class.java:1784)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:710)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
"
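An IncompatibleClassChangeError at class-loading time like the one above is commonly caused by jars built against different Scala binary versions ending up on the same classpath, for example stale jars left over after running ./dev/change-scala-version.sh without a clean rebuild. A minimal sketch of a check for that, using hypothetical jar names (the real input would be the jars directory shown in the command line above):

```shell
# check_mixed_versions: print each distinct Scala binary-version suffix
# found in a newline-separated list of jar names. More than one distinct
# suffix means the classpath mixes Scala binary versions.
check_mixed_versions() {
  grep -oE '_2\.1[01]' | sort -u
}

# Example with hypothetical jar names; on a real build, feed it something
# like `ls assembly/target/scala-2.10/jars` (path taken from the log above).
printf 'spark-core_2.10-1.6.1.jar\nspark-sql_2.11-1.6.1.jar\n' | check_mixed_versions
```

If both _2.10 and _2.11 show up, a `clean package` rebuild after change-scala-version.sh usually resolves it.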
Raymond Honderdors
Team Lead Analytics BI
Business Intelligence Developer
raymond.honderd...@sizmek.com<mailto:raymond.honderd...@sizmek.com>
T +972.7325.3569
Herzliya

From: Raymond Honderdors [mailto:raymond.honderd...@sizmek.com]
Sent: Tuesday, April 05, 2016 4:23 PM
To: Reynold Xin <r...@databricks.com>
Cc: dev@spark.apache.org
Subject: RE: Build with Thrift Server & Scala 2.11

I can see that the build is successful (-Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -Dscala-2.11 -DskipTests clean package), but the documentation page still says:

"
Building With Hive and JDBC Support

To enable Hive integration for Spark SQL along with its JDBC server and CLI, add the -Phive and -Phive-thriftserver profiles to your existing build options. By default Spark will build with Hive 0.13.1 bindings.

# Apache Hadoop 2.4.X with Hive 13 support
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package

Building for Scala 2.11

To produce a Spark package compiled with Scala 2.11, use the -Dscala-2.11 property:

./dev/change-scala-version.sh 2.11
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package

Spark does not yet support its JDBC component for Scala 2.11.
"

Source: http://spark.apache.org/docs/latest/building-spark.html

When I try to start the thrift server I get the following error:

"
16/04/05 16:09:11 INFO BlockManagerMaster: Registered BlockManager
16/04/05 16:09:12 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: java.net.UnknownHostException: namenode
        at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:374)
        at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:312)
        at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:178)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:665)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:601)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:148)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
        at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1667)
        at org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:67)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:517)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:57)
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:77)
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:726)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.UnknownHostException: namenode
        ... 26 more
16/04/05 16:09:12 INFO SparkUI: Stopped Spark web UI at http://10.10.182.195:4040
16/04/05 16:09:12 INFO SparkDeploySchedulerBackend: Shutting down all executors
"

From: Reynold Xin [mailto:r...@databricks.com]
Sent: Tuesday, April 05, 2016 3:57 PM
To: Raymond Honderdors <raymond.honderd...@sizmek.com<mailto:raymond.honderd...@sizmek.com>>
Cc: dev@spark.apache.org<mailto:dev@spark.apache.org>
Subject: Re: Build with Thrift Server & Scala 2.11

What do you mean? The Jenkins build for Spark uses 2.11 and also builds the thrift server.

On Tuesday, April 5, 2016, Raymond Honderdors <raymond.honderd...@sizmek.com<mailto:raymond.honderd...@sizmek.com>> wrote:

Is anyone looking into this one, Build with Thrift Server & Scala 2.11? If so, when can we expect it?
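On the UnknownHostException earlier in the thread: the stack trace fails inside EventLoggingListener, so the HDFS URI being resolved (likely from spark.eventLog.dir or fs.defaultFS — an assumption; check spark-defaults.conf and core-site.xml) names a host "namenode" that the machine cannot resolve. A quick hedged check sketch:

```shell
# resolve_check: succeed and print the address if the given host name
# resolves on this machine, otherwise print a hint. "namenode" is the host
# from the UnknownHostException; if it does not resolve, fix DNS or
# /etc/hosts, or point the eventLog/defaultFS configuration at a real host.
resolve_check() {
  getent hosts "$1" || echo "$1 does not resolve"
}

resolve_check localhost
resolve_check namenode
```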