Hi,

I am trying to read data from Kafka and process it with Spark Streaming.
I have attached my source code and the error log.

To integrate Kafka, I added the following dependencies to pom.xml:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.3.0</version>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.3.0</version>
</dependency>
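If the error is a ClassNotFoundException or NoClassDefFoundError at runtime, a common cause is that the Kafka integration jar is available at compile time but missing from the runtime classpath. A minimal sketch of one possible fix (assuming the maven-shade-plugin; the plugin version here is illustrative) that bundles the dependencies into the application jar:

```xml
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.3</version>
            <executions>
                <!-- Build an uber-jar during the package phase -->
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
```

After `mvn package`, running the shaded jar should put spark-streaming-kafka and its transitive Kafka dependencies on the classpath.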

I have attached the full error log; please take a look at why it throws
this error, since the class exists on my classpath.
I am running Spark and Kafka locally from a Java class:

SparkConf conf = new SparkConf()
        .setAppName("Spark Demo")
        .setMaster("local[2]")
        .set("spark.executor.memory", "1g");
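For reference, a minimal sketch of consuming from Kafka with the receiver-based API in spark-streaming-kafka_2.10 1.3.0. The topic name "test", the ZooKeeper address, and the consumer group id are assumptions for illustration, not taken from the attached source:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaPairReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka.KafkaUtils;

public class KafkaSparkDemo {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf()
                .setAppName("Spark Demo")
                .setMaster("local[2]")
                .set("spark.executor.memory", "1g");

        // Batch interval of 2 seconds (illustrative).
        JavaStreamingContext jssc =
                new JavaStreamingContext(conf, Durations.seconds(2));

        // Map of topic -> number of receiver threads; topic "test" is an assumption.
        Map<String, Integer> topics = new HashMap<String, Integer>();
        topics.put("test", 1);

        // Receiver-based stream that connects through ZooKeeper (Spark 1.3 API).
        JavaPairReceiverInputDStream<String, String> messages =
                KafkaUtils.createStream(jssc, "localhost:2181",
                        "spark-demo-group", topics);

        messages.print();

        jssc.start();
        jssc.awaitTermination();
    }
}
```

Note that `local[2]` (or more) matters here: the receiver occupies one thread, so at least one more is needed to process the received data.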



Attachment: spark-error.log
Description: Binary data

Attachment: kafka-spark.java
Description: Binary data

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
