You seem to be combining Scala 2.10 and 2.11 libraries - your sbt project is built for 2.11, whereas you are trying to pull in spark-streaming-kafka-assembly_*2.10*-1.6.1.jar.
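For what it's worth, a minimal build.sbt along these lines keeps everything on Scala 2.11 and Spark 2.0.0 (sketch only - the project name is a placeholder, and spark-streaming-kafka-0-8 is the Spark 2.0 artifact for the old Kafka connector; check the exact coordinates on Maven Central against your Kafka broker version):

// build.sbt -- sketch only: Scala 2.11 and Spark 2.0.0 throughout
name := "CEP_streaming"            // placeholder project name
version := "1.0"
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"                % "2.0.0" % "provided",
  "org.apache.spark" %% "spark-sql"                 % "2.0.0" % "provided",
  "org.apache.spark" %% "spark-hive"                % "2.0.0" % "provided",
  "org.apache.spark" %% "spark-streaming"           % "2.0.0" % "provided",
  // not "provided": the Kafka connector has to end up in your assembly jar
  // (or be supplied at submit time)
  "org.apache.spark" %% "spark-streaming-kafka-0-8" % "2.0.0"
)

With %% the _2.11 suffix is derived from scalaVersion automatically, so the binary versions can't drift apart. Alternatively, drop the --jars of the 1.6.1 assembly and let spark-submit resolve a matching artifact, e.g. --packages org.apache.spark:spark-streaming-kafka-0-8_2.11:2.0.0.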
On Fri, Aug 19, 2016 at 11:24 AM, Mich Talebzadeh <mich.talebza...@gmail.com> wrote:
> Hi,
>
> My spark streaming app with 1.6.1 used to work.
>
> Now with
>
> scala> sc version
> res0: String = 2.0.0
>
> Compiling with sbt assembly as before, with the following:
>
>   version := "1.0",
>   scalaVersion := "2.11.8",
>   mainClass in Compile := Some("myPackage.${APPLICATION}")
> )
> libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0" % "provided"
> libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.0.0" % "provided"
> libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.0.0" % "provided"
> libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.0.0" % "provided"
> libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.6.1" % "provided"
>
> I downgraded scalaVersion to 2.10.4, but it did not change anything.
>
> It compiles OK, but at run time it fails.
>
> This jar is added to spark-submit:
>
> --jars /home/hduser/jars/spark-streaming-kafka-assembly_2.10-1.6.1.jar \
>
> And this is the error:
>
> Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
>         at kafka.utils.Pool.<init>(Pool.scala:28)
>         at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<init>(FetchRequestAndResponseStats.scala:60)
>         at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<clinit>(FetchRequestAndResponseStats.scala)
>         at kafka.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:39)
>         at org.apache.spark.streaming.kafka.KafkaCluster.connect(KafkaCluster.scala:52)
>         at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers$1.apply(KafkaCluster.scala:345)
>         at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers$1.apply(KafkaCluster.scala:342)
>         at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
>         at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:35)
>         at org.apache.spark.streaming.kafka.KafkaCluster.org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers(KafkaCluster.scala:342)
>         at org.apache.spark.streaming.kafka.KafkaCluster.getPartitionMetadata(KafkaCluster.scala:125)
>         at org.apache.spark.streaming.kafka.KafkaCluster.getPartitions(KafkaCluster.scala:112)
>         at org.apache.spark.streaming.kafka.KafkaUtils$.getFromOffsets(KafkaUtils.scala:211)
>         at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:484)
>         at CEP_streaming$.main(CEP_streaming.scala:123)
>         at CEP_streaming.main(CEP_streaming.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>
> Any ideas appreciated
>
> Dr Mich Talebzadeh
>
> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
> http://talebzadehmich.wordpress.com