Thanks a lot. It worked after keeping all versions the same: 1.2.0.
On Wed, Jun 24, 2015 at 2:24 AM, Tathagata Das wrote:
Why are you mixing spark versions between streaming and core??
Your core is 1.2.0 and streaming is 1.4.0.
On Tue, Jun 23, 2015 at 1:32 PM, Shushant Arora wrote:
It throws an exception for WriteAheadLogUtils after excluding the core and
streaming jars:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/util/WriteAheadLogUtils$
        at org.apache.spark.streaming.kafka.KafkaUtils$.createStream(KafkaUtils.scala:84)
        at org
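For reference, here is a minimal sketch of the kind of driver code that hits
that call site. The StreamingContext setup, ZooKeeper quorum, group id, and
topic name are placeholders I am assuming, not details from this thread:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    object KafkaStreamSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("kafka-stream-sketch")
        val ssc = new StreamingContext(conf, Seconds(10))
        // The 1.4.0 Kafka connector's createStream references
        // WriteAheadLogUtils (see KafkaUtils.scala:84 in the trace above);
        // a 1.2.0 spark-streaming jar on the classpath does not contain
        // that class, hence the NoClassDefFoundError.
        val stream = KafkaUtils.createStream(
          ssc, "zkhost:2181", "mygroup", Map("mytopic" -> 1))
        stream.map(_._2).print()
        ssc.start()
        ssc.awaitTermination()
      }
    }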
You must not include spark-core and spark-streaming in the assembly. They
are already present in the installation, and the presence of multiple
versions of Spark may throw off the classloaders in weird ways. So build the
assembly with those dependencies marked as scope=provided.
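With sbt-assembly, for example, that could look like the sketch below (my
own illustration, assuming the cluster runs Spark 1.2.0 as in this thread; a
Maven build would use <scope>provided</scope> instead):

    // Spark core and streaming come from the cluster installation, so mark
    // them "provided" to keep them out of the assembly jar.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core"      % "1.2.0" % "provided",
      "org.apache.spark" %% "spark-streaming" % "1.2.0" % "provided",
      // The Kafka connector is not part of the installation, so it must
      // ship inside the assembly, at the same Spark version as the cluster.
      "org.apache.spark" %% "spark-streaming-kafka" % "1.2.0"
    )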
On Tue, Jun 23, 2015, Shushant Arora wrote:
Hi,

While using Spark Streaming (1.2) with Kafka, I am getting the error below.
The receivers are getting killed, but jobs still get scheduled at each
stream interval.
15/06/23 18:42:35 WARN TaskSetManager: Lost task 0.1 in stage 18.0 (TID 82, ip(XX)): java.io.IOException: Failed to connect to ip(XXX