Some discussion is there in https://github.com/dibbhatt/kafka-spark-consumer
and some is mentioned in https://issues.apache.org/jira/browse/SPARK-11045
Let me know if those answer your question.
In short, Direct Stream is a good choice if you need exactly-once semantics and
message ordering, but ma
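For reference, the Direct Stream mentioned above is created with `KafkaUtils.createDirectStream` from the Spark 1.6-era spark-streaming-kafka module (Kafka 0.8 API). A minimal sketch; the broker address, topic name, and app name are placeholders:

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object DirectStreamSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("DirectKafkaExample")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Direct Stream reads partition offsets itself (no receiver),
    // which is what enables exactly-once semantics and per-partition ordering.
    val kafkaParams = Map("metadata.broker.list" -> "broker1:9092") // placeholder broker
    val topics = Set("events")                                      // placeholder topic

    val stream = KafkaUtils.createDirectStream[
      String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topics)

    stream.map(_._2).print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```

By contrast, the receiver-based consumer discussed in this thread (dibbhatt/kafka-spark-consumer) uses its own ReceiverLauncher rather than this API.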
Right.. if you are using the GitHub version, just modify the ReceiverLauncher
and add that. I will fix it for Spark 1.6 and release a new version on
spark-packages for Spark 1.6.
Dibyendu
On Thu, Jan 7, 2016 at 4:14 PM, Ted Yu wrote:
> I cloned g...@github.com:dibbhatt/kafka-spark-consumer.git a mom
Hi,
We have been using Spark Streaming for a little while now.
Until now, we were running our Spark Streaming jobs on Spark 1.5.1, and it
was working well. Yesterday, we upgraded to Spark 1.6.0 without any changes
to the code, but our streaming jobs are no longer working. We are
getting an "Abs