Re: Kafka Spark Streaming on Spark 1.1

2014-09-18 Thread JiajiaJing
Yeah, I forgot to build a new jar file against Spark 1.1...
After rebuilding, the errors are gone.
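For anyone landing on this thread later, the fix amounted to rebuilding the application jar against the Spark 1.1 artifacts and resubmitting. A rough sequence, assuming an sbt project; the jar path and main class below are placeholders, not from the original thread:

```shell
# Rebuild the app against the Spark 1.1 dependencies declared in build.sbt
sbt clean package

# Resubmit with the freshly built jar (class name and jar path are placeholders)
spark-submit --class com.example.KafkaWordCount \
  target/scala-2.10/kafka-streaming-app_2.10-0.1.jar
```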

Thank you very much!



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Kafka-Spark-Streaming-on-Spark-1-1-tp14597p14604.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Kafka Spark Streaming on Spark 1.1

2014-09-18 Thread Tim Smith
Which Kafka receiver are you using? Did you build a new jar for your
app with the latest streaming-kafka code for 1.1?
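Concretely, upgrading usually means bumping both Spark artifacts in the build so the app is compiled against the same version the cluster runs. A sketch of an sbt build definition, assuming a Scala 2.10 / sbt project (the project name is a placeholder; the artifact coordinates are the published Spark 1.1.0 ones):

```scala
// build.sbt (sketch): both Spark artifacts must be on the same version.
// If the app jar still carries classes built against 1.0, the receiver
// fails at runtime with java.lang.AbstractMethodError.
name := "kafka-streaming-app"  // placeholder project name

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming"       % "1.1.0" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.1.0"
)
```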


On Thu, Sep 18, 2014 at 11:47 AM, JiajiaJing  wrote:
> Hi Spark Users,
>
> We just upgraded our Spark version from 1.0 to 1.1, and we are trying to
> re-run all the projects we implemented and tested on Spark 1.0.
> However, when we execute the Spark Streaming project that streams data
> from Kafka topics, it fails with the error message below. I have no idea
> why this occurs, because the same project runs successfully on Spark 1.0.
> May I get some help with this, please?
>
> Thank you very much!
>
>
> 2014-09-18 11:06:08,841 ERROR [sparkDriver-akka.actor.default-dispatcher-4]
> scheduler.ReceiverTracker (Logging.scala:logError(75)) - Deregistered
> receiver for stream 0: Error starting receiver 0 - java.lang.AbstractMethodError
>         at org.apache.spark.Logging$class.log(Logging.scala:52)
>         at org.apache.spark.streaming.kafka.KafkaReceiver.log(KafkaInputDStream.scala:66)
>         at org.apache.spark.Logging$class.logInfo(Logging.scala:59)
>         at org.apache.spark.streaming.kafka.KafkaReceiver.logInfo(KafkaInputDStream.scala:66)
>         at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:86)
>         at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:121)
>         at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:106)
>         at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:264)
>         at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:257)
>         at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
>         at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
>         at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
>         at org.apache.spark.scheduler.Task.run(Task.scala:54)
>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>         at java.lang.Thread.run(Thread.java:662)
>
>
>
>
> Best Regards,
>
> Jiajia
>
>
>




Kafka Spark Streaming on Spark 1.1

2014-09-18 Thread JiajiaJing
Hi Spark Users,

We just upgraded our Spark version from 1.0 to 1.1, and we are trying to
re-run all the projects we implemented and tested on Spark 1.0.
However, when we execute the Spark Streaming project that streams data
from Kafka topics, it fails with the error message below. I have no idea
why this occurs, because the same project runs successfully on Spark 1.0.
May I get some help with this, please?

Thank you very much!


2014-09-18 11:06:08,841 ERROR [sparkDriver-akka.actor.default-dispatcher-4]
scheduler.ReceiverTracker (Logging.scala:logError(75)) - Deregistered
receiver for stream 0: Error starting receiver 0 - java.lang.AbstractMethodError
        at org.apache.spark.Logging$class.log(Logging.scala:52)
        at org.apache.spark.streaming.kafka.KafkaReceiver.log(KafkaInputDStream.scala:66)
        at org.apache.spark.Logging$class.logInfo(Logging.scala:59)
        at org.apache.spark.streaming.kafka.KafkaReceiver.logInfo(KafkaInputDStream.scala:66)
        at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:86)
        at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:121)
        at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:106)
        at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:264)
        at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:257)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
        at org.apache.spark.scheduler.Task.run(Task.scala:54)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)
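For readers searching on this error: an AbstractMethodError at receiver start is usually a sign of binary incompatibility, i.e. the application jar was compiled against a different Spark version than the one it runs on, which is consistent with the trace above failing inside the Logging trait. A minimal receiver-based setup compiled against the 1.1.0 artifacts looks roughly like this sketch; the object name, ZooKeeper quorum, consumer group, and topic are placeholders, while `KafkaUtils.createStream` is the receiver-based API shipped in spark-streaming-kafka 1.1.0:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaStreamApp {  // placeholder app name
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaStreamApp")
    val ssc = new StreamingContext(conf, Seconds(2))

    // zkQuorum, group id, and topic map are placeholders.
    // createStream returns a DStream of (key, message) pairs;
    // we keep only the message payloads.
    val lines = KafkaUtils
      .createStream(ssc, "zkhost:2181", "my-group", Map("my-topic" -> 1))
      .map(_._2)
    lines.count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

The key point is that this must be compiled with the same Spark version (here 1.1.0) that the cluster runs; reusing a jar built against 1.0 triggers exactly the error in this thread.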




Best Regards,

Jiajia



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Kafka-Spark-Streaming-on-Spark-1-1-tp14597.html