Thanks guys.
On Wed, Sep 23, 2015 at 3:54 PM, Tathagata Das wrote:
> SPARK_CLASSPATH is, I believe, deprecated right now. So I am not surprised
> that there is some difference in the code paths.
>
> On Tue, Sep 22, 2015 at 9:45 PM, Saisai Shao wrote:
I must have gone mad :) Thanks for pointing it out. I downloaded the 1.5.0
assembly jar and added it in SPARK_CLASSPATH.
However, I am getting a new error now:
>>> kvs = KafkaUtils.createDirectStream(ssc, ['spark'], {"metadata.broker.list": 'localhost:9092'})
I think it is something related to the class loader; the behavior is different
for classpath and --jars. If you want to know the details, I think you'd
better dig into the source code.
Thanks
Jerry
On Tue, Sep 22, 2015 at 9:10 PM, ayan guha wrote:
> I must have gone mad :)
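The classpath-versus-`--jars` distinction Jerry mentions comes down to two launch styles, sketched below (the paths and the script name `kafka_test.py` are hypothetical). With `--jars`, spark-submit distributes the jar to the driver and executors itself instead of relying on the deprecated SPARK_CLASSPATH variable:

```shell
# Style 1 (deprecated, as used in this thread): jar picked up from SPARK_CLASSPATH
export SPARK_CLASSPATH=/path/to/spark-streaming-kafka-assembly_2.10-1.5.0.jar
bin/spark-submit kafka_test.py

# Style 2 (recommended): spark-submit ships the jar to driver and executors itself
bin/spark-submit \
  --jars /path/to/spark-streaming-kafka-assembly_2.10-1.5.0.jar \
  kafka_test.py
```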
SPARK_CLASSPATH is, I believe, deprecated right now. So I am not surprised
that there is some difference in the code paths.
On Tue, Sep 22, 2015 at 9:45 PM, Saisai Shao wrote:
> I think it is something related to the class loader; the behavior is different
> for classpath and --jars.
I think you're using the wrong version of the Kafka assembly jar. The Python
API for the direct Kafka stream is not supported in Spark 1.3.0; you'd better
change to version 1.5.0. It looks like you're running Spark 1.5.0, so why did
you choose the 1.3.0 Kafka assembly?
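The mismatch Saisai points out is visible in the assembly jar's file name, which encodes both the Scala build and the Spark release. A sketch with hypothetical paths; both numbers should match the cluster being submitted to:

```shell
# Naming convention: spark-streaming-kafka-assembly_<scala>-<spark>.jar
# Wrong for a Spark 1.5.0 cluster (Python direct stream is not in 1.3.0):
#   spark-streaming-kafka-assembly_2.10-1.3.0.jar
# Matching jar for Spark 1.5.0 built against Scala 2.10:
bin/spark-submit \
  --jars /path/to/spark-streaming-kafka-assembly_2.10-1.5.0.jar \
  kafka_test.py
```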
Hi
I have added the spark-streaming-kafka assembly jar to SPARK_CLASSPATH:
>>> print os.environ['SPARK_CLASSPATH']
D:\sw\spark-streaming-kafka-assembly_2.10-1.3.0.jar
Now I am facing the below issue with a test topic:
>>> ssc = StreamingContext(sc, 2)
>>> kvs =
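For context, the setup this thread is debugging, pieced together from the shell snippets quoted above, amounts to roughly the following sketch (the app name is hypothetical; the topic and broker come from the createDirectStream call quoted earlier in the thread). It needs PySpark 1.5.x, the matching Kafka assembly jar on the classpath, and a running Kafka broker, so it is not runnable stand-alone:

```python
# Minimal PySpark direct Kafka stream, assembled from the thread's snippets
# (topic 'spark', broker localhost:9092).
from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

sc = SparkContext(appName="kafka-direct-test")  # app name is hypothetical
ssc = StreamingContext(sc, 2)  # 2-second batch interval, as in the thread

kvs = KafkaUtils.createDirectStream(
    ssc, ["spark"], {"metadata.broker.list": "localhost:9092"})
kvs.pprint()  # print each batch of (key, value) pairs

ssc.start()
ssc.awaitTermination()
```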