Hello Everyone,

Thank you for your time and help :).

My goal here is to get this program working:
https://github.com/apache/spark/blob/master/examples/scala-2.10/src/main/java/org/apache/spark/examples/streaming/JavaKafkaWordCount.java
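
For reference, the core of that example looks roughly like this (trimmed from the linked file; the topic name and batch interval are placeholders I filled in, the class name matches the --class I pass to spark-submit below, and the Zookeeper/group values match the log further down):

import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Pattern;

import scala.Tuple2;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFunction;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaPairDStream;
import org.apache.spark.streaming.api.java.JavaPairReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka.KafkaUtils;

public class SimpleApp {
  private static final Pattern SPACE = Pattern.compile(" ");

  public static void main(String[] args) throws Exception {
    SparkConf sparkConf = new SparkConf().setAppName("SimpleApp");
    // 2-second batches
    JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, new Duration(2000));

    // topic name -> number of receiver threads ("test" is a placeholder)
    Map<String, Integer> topicMap = new HashMap<String, Integer>();
    topicMap.put("test", 1);

    // zkQuorum and group id taken from the log below
    JavaPairReceiverInputDStream<String, String> messages =
        KafkaUtils.createStream(jssc, "10.0.1.232:2181", "c1", topicMap);

    // drop the Kafka message key, keep the value
    JavaDStream<String> lines = messages.map(new Function<Tuple2<String, String>, String>() {
      public String call(Tuple2<String, String> tuple2) { return tuple2._2(); }
    });

    JavaDStream<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
      public Iterable<String> call(String x) { return Arrays.asList(SPACE.split(x)); }
    });

    JavaPairDStream<String, Integer> wordCounts = words.mapToPair(
        new PairFunction<String, String, Integer>() {
          public Tuple2<String, Integer> call(String s) { return new Tuple2<String, Integer>(s, 1); }
        }).reduceByKey(new Function2<Integer, Integer, Integer>() {
          public Integer call(Integer i1, Integer i2) { return i1 + i2; }
        });

    wordCounts.print();
    jssc.start();
    jssc.awaitTermination();
  }
}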

The only lines I do not have from the example are lines 62-67.

Background: I have EC2 instances running, and standalone Spark is running on
top of Cloudera Manager 5.2.

The pom file is attached and is the same for both clusters: pom.xml
<http://apache-spark-user-list.1001560.n3.nabble.com/file/n20879/pom.xml>

Here are a few different approaches I have taken and the issues I run into:

*Standalone Mode*

1) Use the spark-submit script to run it:

/opt/cloudera/parcels/CDH-5.2.1-1.cdh5.2.1.p0.12/lib/spark/bin/spark-submit \
  --class SimpleApp \
  --master spark://10.0.1.230:7077 \
  --jars $(echo /home/ec2-user/sparkApps/SimpleApp/lib/*.jar | tr ' ' ',') \
  /home/ec2-user/sparkApps/SimpleApp/target/simple-project-1.0.jar

Interestingly, I was initially getting an error like this: "Initial job has not
accepted any resources; check your cluster UI".

Now, when I run it, it prints the three "Hello world" statements in my code
(attached: KafkaJavaConsumer.txt
<http://apache-spark-user-list.1001560.n3.nabble.com/file/n20879/KafkaJavaConsumer.txt>)
and then it tries to start the Kafka stream, but fails:

14/12/29 05:58:05 INFO KafkaReceiver: Starting Kafka Consumer Stream with group: c1
14/12/29 05:58:05 INFO ReceiverTracker: Registered receiver for stream 0 from akka://sparkDriver
14/12/29 05:58:05 INFO KafkaReceiver: Connecting to Zookeeper: 10.0.1.232:2181
14/12/29 05:58:05 INFO BlockGenerator: Started block pushing thread
14/12/29 05:58:05 INFO ReceiverSupervisorImpl: Stopping receiver with message: Error starting receiver 0: java.lang.NoClassDefFoundError: scala/reflect/ClassManifest
14/12/29 05:58:05 INFO ReceiverSupervisorImpl: Called receiver onStop
14/12/29 05:58:05 INFO ReceiverSupervisorImpl: Deregistering receiver 0
14/12/29 05:58:05 ERROR ReceiverTracker: Deregistered receiver for stream 0: Error starting receiver 0 - java.lang.NoClassDefFoundError: scala/reflect/ClassManifest
        at kafka.utils.Log4jController$.<init>(Log4jController.scala:29)
        at kafka.utils.Log4jController$.<clinit>(Log4jController.scala)
        at kafka.utils.Logging$class.$init$(Logging.scala:29)
        at kafka.utils.VerifiableProperties.<init>(VerifiableProperties.scala:26)
        at kafka.consumer.ConsumerConfig.<init>(ConsumerConfig.scala:94)
        at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:96)
        at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:121)
        at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:106)
        at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:264)
        at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:257)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
        at org.apache.spark.scheduler.Task.run(Task.scala:54)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:180)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
        at java.lang.Thread.run(Thread.java:722)
Caused by: java.lang.ClassNotFoundException: scala.reflect.ClassManifest
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
        ... 18 more

14/12/29 05:58:05 INFO ReceiverSupervisorImpl: Stopped receiver 0
14/12/29 05:58:05 INFO BlockGenerator: Stopping BlockGenerator

I ran into a couple of other ClassNotFound errors and was able to solve them
by adding dependencies to the pom file, but I have not found a fix of that
kind for this error.
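
By "adding dependencies" I mean entries like the Kafka integration artifact below (the version number here is just a placeholder; the attached pom has the exact versions I am using):

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.1.0</version>
</dependency>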

On the Kafka side of things, I am simply typing in messages on another console
(via the console producer) as soon as I start the Java app. Is this okay?
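
To be concrete, the producer side is just something like this (the broker address and topic name are placeholders; I am assuming the default broker port 9092):

bin/kafka-console-producer.sh --broker-list 10.0.1.232:9092 --topic test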

I have not set up an advertised host (advertised.host.name) on the Kafka side,
since I was still able to receive messages from other consoles by pointing a
consumer at the private ip:port. Is this okay?
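
By "a consumer" I mean a console consumer along these lines (again, the topic name is a placeholder; the Zookeeper address is the one from the log above):

bin/kafka-console-consumer.sh --zookeeper 10.0.1.232:2181 --topic test --from-beginning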

Lastly, is there a command or setting, like --from-beginning for the console
consumer, that the Java application can use to get messages from the beginning?
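
My guess is that it would have to be passed as a Kafka consumer property, using the KafkaUtils.createStream overload that takes a kafkaParams map, roughly like the sketch below (topic name is a placeholder, and I have not verified that this behaves like --from-beginning):

import java.util.HashMap;
import java.util.Map;

import kafka.serializer.StringDecoder;

import org.apache.spark.api.java.StorageLevels;
import org.apache.spark.streaming.api.java.JavaPairReceiverInputDStream;
import org.apache.spark.streaming.kafka.KafkaUtils;

...

// jssc is the JavaStreamingContext shown earlier
Map<String, String> kafkaParams = new HashMap<String, String>();
kafkaParams.put("zookeeper.connect", "10.0.1.232:2181");
kafkaParams.put("group.id", "c1");
// start from the earliest available offset when this group has no committed offset
kafkaParams.put("auto.offset.reset", "smallest");

Map<String, Integer> topicMap = new HashMap<String, Integer>();
topicMap.put("test", 1);

JavaPairReceiverInputDStream<String, String> messages =
    KafkaUtils.createStream(jssc, String.class, String.class,
        StringDecoder.class, StringDecoder.class,
        kafkaParams, topicMap, StorageLevels.MEMORY_AND_DISK_SER_2);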

Thanks a lot for the help and happy holidays!
