Integrating Kafka 0.10 or higher with Spark 2.1.1 -- required jars

2017-07-07 Thread mahendra
Hi,

   After finding that Python support for the Spark Streaming Kafka API is not available for Kafka brokers version 0.10 or higher, I tried to run a Kafka example in Scala instead. I used the following spark-submit command:

/usr/local/spark/bin/spark-submit \
  --jars /home/mahendra/spark-streaming-kafka-0-10_2.11-2.1.1.jar,/home/mahendra/spark-streaming-kafka-0-10-assembly_2.11-2.1.1.jar \
  --class org.apache.spark.examples.streaming.KafkaWordCount \
  /usr/local/spark/examples/jars/spark-examples_2.11-2.1.1.jar \
  10.0.16.96:2181 group_test streams 6
 
Basically, I'm trying to include these jars:
spark-streaming-kafka-0-10_2.11-2.1.1.jar,
spark-streaming-kafka-0-10-assembly_2.11-2.1.1.jar
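
As a side note, instead of listing local jar paths by hand, spark-submit can resolve the 0-10 integration and its transitive dependencies from Maven Central via --packages. This is only a sketch, assuming the machine has network access; the class name, example jar path, and program arguments are carried over unchanged from the command above:

```
# Let spark-submit fetch the 0-10 integration artifact (plus its
# transitive dependencies) instead of passing local jars via --jars.
/usr/local/spark/bin/spark-submit \
  --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.1.1 \
  --class org.apache.spark.examples.streaming.KafkaWordCount \
  /usr/local/spark/examples/jars/spark-examples_2.11-2.1.1.jar \
  10.0.16.96:2181 group_test streams 6
```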

But it throws this exception:

Exception in thread "main" java.lang.NoClassDefFoundError: kafka/serializer/StringDecoder
	at org.apache.spark.streaming.kafka.KafkaUtils$.createStream(KafkaUtils.scala:66)
	at org.apache.spark.examples.streaming.KafkaWordCount$.main(KafkaWordCount.scala:57)
	at org.apache.spark.examples.streaming.KafkaWordCount.main(KafkaWordCount.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: kafka.serializer.StringDecoder
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 12 more
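
For context: kafka.serializer.StringDecoder belongs to the old Kafka 0.8 consumer API, and the stack trace shows the bundled KafkaWordCount example calling the receiver-based KafkaUtils.createStream from spark-streaming-kafka-0-8, so the 0-10 jars on the classpath never contain that class. The 0-10 integration uses a different, direct-stream API. The following is a minimal Scala sketch of that API, not the bundled example; the broker address, topic, and group id are guesses carried over from the command above, and note that the 0-10 API connects to a Kafka broker port (typically 9092), not ZooKeeper's 2181:

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object DirectKafkaWordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("DirectKafkaWordCount")
    val ssc = new StreamingContext(conf, Seconds(2))

    // The 0-10 integration talks to brokers directly (bootstrap.servers),
    // not to ZooKeeper, and uses the new deserializer interfaces.
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "10.0.16.96:9092",  // broker port (assumed), not ZK's 2181
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "group_test",
      "auto.offset.reset" -> "latest"
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](Seq("streams"), kafkaParams))

    // Classic word count over the record values.
    stream.map(_.value)
      .flatMap(_.split(" "))
      .map((_, 1L))
      .reduceByKey(_ + _)
      .print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

The 0-8 and 0-10 artifacts are not interchangeable: an application has to be written against one API or the other, and the matching integration jar supplied at submit time.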

Any idea if I have missed some related jars?

Thanks,
Mahendra 



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Integrating-Kafka-0-10-or-higher-with-Spark-2-1-1-required-jars-tp28830.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.



