I am not able to stop my Spark Streaming job.
Let me explain briefly:
* getting data from a Kafka topic
* splitting the data to create a JavaRDD
* mapping the JavaRDD to a JavaPairRDD for a reduceByKey transformation
* writing the JavaPairRDD into the Cassandra (C*) DB // something going wrong here
I am using Spark Streaming for a basic streaming movie-count program.
I first map the year and movie name into a JavaPairRDD, and then
use reduceByKey for counting the movies year-wise.
I am using Cassandra for output, and the Spark Streaming application is not
stopping.
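One common reason a streaming job never stops is that it blocks forever on awaitTermination() and is then killed mid-batch instead of being stopped gracefully. Below is a minimal, self-contained sketch of a graceful shutdown (the queue-backed stream stands in for the Kafka stream, and the timeout value is arbitrary); it is not the original program, just the stop pattern in isolation:

```java
import java.util.Arrays;
import java.util.LinkedList;
import java.util.Queue;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.StreamingContextState;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class GracefulStopSketch {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("graceful-stop");
        JavaStreamingContext ssc = new JavaStreamingContext(conf, Durations.seconds(1));

        // A queue-backed stream stands in for the Kafka direct stream here.
        Queue<JavaRDD<String>> queue = new LinkedList<>();
        queue.add(ssc.sparkContext().parallelize(Arrays.asList("a", "b")));
        ssc.queueStream(queue).print(); // at least one output operation is required

        ssc.start();
        // Run for a bounded time instead of blocking forever on awaitTermination().
        ssc.awaitTerminationOrTimeout(3000);
        // stopSparkContext = true, stopGracefully = true:
        // in-flight batches are allowed to finish before shutdown.
        ssc.stop(true, true);
        assert ssc.getState() == StreamingContextState.STOPPED;
    }
}
```

The key call is `ssc.stop(true, true)`: the second flag asks Spark to drain the batches already received rather than cutting the job off mid-write.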
code:
directKafkaStream.foreachRDD(rdd -> {
    // rdd.foreach() runs on the executors, so adding to a driver-side
    // list from inside it is lost on a real cluster; collect() brings
    // the records back to the driver instead
    for (Tuple2<String, String> record : rdd.collect()) {
        messages1.add(record._2);
    }
    JavaRDD<String> lines = sc.parallelize(messages1);
});
The error is in the highlighted line; the code, error, and pom.xml are included below.
code:
final Session session = connector.openSession();
final PreparedStatement prepared =
    session.prepare("INSERT INTO spark_test5.messages JSON ?");
JavaStreamingContext ssc = new
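For the `INSERT INTO ... JSON ?` form, the bound value must be a single JSON document whose fields match the table's columns. Below is a minimal sketch of building that payload; the `year`/`count` column names are assumptions about the table schema, and the actual execute call is shown only as a comment because it needs a live Cassandra cluster:

```java
public class JsonInsertSketch {
    // Builds the JSON document that "INSERT INTO spark_test5.messages JSON ?"
    // expects, assuming the table has columns (year text, count int).
    static String toJson(String year, int count) {
        return String.format("{\"year\": \"%s\", \"count\": %d}", year, count);
    }

    public static void main(String[] args) {
        String json = toJson("1999", 2);
        System.out.println(json); // {"year": "1999", "count": 2}
        // With a live cluster and the prepared statement from the question:
        // session.execute(prepared.bind(json));
    }
}
```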
Hi,
I got the error below when I executed the job:

Exception in thread "main" java.lang.NoSuchMethodError:
scala.runtime.ObjectRef.zero()Lscala/runtime/ObjectRef;
at
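A NoSuchMethodError on scala.runtime.ObjectRef.zero() is commonly reported when artifacts built against different Scala versions (e.g. _2.10 and _2.11) end up on the same classpath. A sketch of what consistent _2.11 coordinates look like in pom.xml (the artifact choice and version property are placeholders, not the poster's actual pom):

```xml
<!-- Every Spark artifact must share one Scala suffix; mixing _2.10 and
     _2.11 jars is a classic cause of NoSuchMethodError at runtime. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.11</artifactId>
  <version>${spark.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
  <version>${spark.version}</version>
</dependency>
```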
How do I take input from Apache Kafka into Apache Spark Streaming for
stream processing?
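One way is the direct (receiver-less) approach from the spark-streaming-kafka-0-8 integration, which matches the `directKafkaStream` variable used earlier in the thread. A minimal sketch follows; the broker address and topic name are placeholders:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;
import kafka.serializer.StringDecoder;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaPairInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka.KafkaUtils;

public class KafkaInputSketch {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("kafka-input");
        JavaStreamingContext ssc = new JavaStreamingContext(conf, Durations.seconds(2));

        Map<String, String> kafkaParams = new HashMap<>();
        kafkaParams.put("metadata.broker.list", "localhost:9092"); // placeholder broker
        Set<String> topics = new HashSet<>(Arrays.asList("movies")); // placeholder topic

        JavaPairInputDStream<String, String> directKafkaStream =
            KafkaUtils.createDirectStream(ssc, String.class, String.class,
                StringDecoder.class, StringDecoder.class, kafkaParams, topics);

        // Each record is a (key, value) pair; the message payload is record._2.
        directKafkaStream.map(record -> record._2).print();

        ssc.start();
        ssc.awaitTermination();
    }
}
```

From here the stream can feed the mapToPair/reduceByKey counting described above; a running Kafka broker is required to actually execute this.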
-sathya