For some reason you have two different versions of the Spark jars in your
classpath.
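
A quick way to confirm is to run "mvn dependency:tree -Dincludes=org.apache.spark"
and look for more than one Spark version in the output (the -Dincludes filter
just narrows the tree to the Spark artifacts). If two versions show up, align
or exclude the transitive one so that all spark-* artifacts resolve to a
single version.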

Thanks
Best Regards

On Tue, Aug 4, 2015 at 12:37 PM, Deepesh Maheshwari <
deepesh.maheshwar...@gmail.com> wrote:

> Hi,
>
> I am trying to read data from Kafka and process it using Spark.
> I have attached my source code and error log.
>
> To integrate Kafka, I have added these dependencies in pom.xml:
>
> <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-streaming_2.10</artifactId>
>     <version>1.3.0</version>
> </dependency>
>
> <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-streaming-kafka_2.10</artifactId>
>     <version>1.3.0</version>
> </dependency>
>
> I have attached the full error log; please check why it is throwing this
> error, since the class it reports exists in my classpath.
> I am running Spark and Kafka locally from a Java class.
>
> SparkConf conf = new SparkConf().setAppName("Spark Demo")
>         .setMaster("local[2]").set("spark.executor.memory", "1g");
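>
> For context, here is a minimal, self-contained sketch of the kind of job I
> am running (the class name, the topic name "test", the ZooKeeper address
> localhost:2181, the group id, and the 2-second batch interval below are
> placeholders, not the exact values from my attached code):
>
> import java.util.HashMap;
> import java.util.Map;
>
> import org.apache.spark.SparkConf;
> import org.apache.spark.streaming.Duration;
> import org.apache.spark.streaming.api.java.JavaPairReceiverInputDStream;
> import org.apache.spark.streaming.api.java.JavaStreamingContext;
> import org.apache.spark.streaming.kafka.KafkaUtils;
>
> public class SparkKafkaDemo {
>     public static void main(String[] args) {
>         SparkConf conf = new SparkConf().setAppName("Spark Demo")
>                 .setMaster("local[2]").set("spark.executor.memory", "1g");
>
>         // 2-second micro-batches (placeholder interval)
>         JavaStreamingContext jssc =
>                 new JavaStreamingContext(conf, new Duration(2000));
>
>         // One receiver thread for the topic; "test" is a placeholder name
>         Map<String, Integer> topics = new HashMap<String, Integer>();
>         topics.put("test", 1);
>
>         // Receiver-based stream via ZooKeeper (assumed at localhost:2181)
>         JavaPairReceiverInputDStream<String, String> messages =
>                 KafkaUtils.createStream(jssc, "localhost:2181",
>                         "spark-demo-group", topics);
>
>         // Print each batch so the pipeline is observable
>         messages.print();
>
>         jssc.start();
>         jssc.awaitTermination();
>     }
> }
>
> (This uses the receiver-based KafkaUtils.createStream, which consumes
> through ZooKeeper; Spark 1.3.0 also offers the direct
> KafkaUtils.createDirectStream approach.)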
>