es with the previous running count to get the new count
Thanks and regards
Shweta Jadhav
-----VISHNU SUBRAMANIAN wrote: -----
To: Jadhav Shweta
From: VISHNU SUBRAMANIAN
Date: 02/02/2015 04:39PM
Cc: "user@spark.apache.org"
Subject: Re: Java Kafka Word Count Issue
You can use updateStateByKey.
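A minimal sketch of what that could look like with the Spark 1.x Java API (an illustration, not the example's own code: jssc and wordCounts are assumed to be the JavaStreamingContext and the per-batch (word, count) DStream from JavaKafkaWordCount, and the checkpoint path is a placeholder):

    import java.util.List;
    import com.google.common.base.Optional;
    import org.apache.spark.api.java.function.Function2;
    import org.apache.spark.streaming.api.java.JavaPairDStream;

    // updateStateByKey keeps per-key state across batches, so the
    // streaming context needs a checkpoint directory first
    jssc.checkpoint("/tmp/kafka-wordcount-checkpoint");

    // add this batch's counts for a word to its previous running count
    Function2<List<Integer>, Optional<Integer>, Optional<Integer>> updateFunction =
        new Function2<List<Integer>, Optional<Integer>, Optional<Integer>>() {
          @Override
          public Optional<Integer> call(List<Integer> newCounts, Optional<Integer> previous) {
            int sum = previous.or(0);
            for (Integer c : newCounts) {
              sum += c;
            }
            return Optional.of(sum);
          }
        };

    JavaPairDStream<String, Integer> runningCounts =
        wordCounts.updateStateByKey(updateFunction);
    runningCounts.print();

With a running count like this, appending spark to the input should move the printed count from spark 1 to spark 2 to spark 3 across batches instead of resetting each batch.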
First I would check your code to see how you are pushing records into the
topic. Is it reading the whole file each time and resending all of it?
Then see if you are using the same consumer.id on the Spark side. Otherwise
you are not reading from the same offset when restarting Spark but instead
re-reading from the beginning of the topic.
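For reference, a minimal sketch of how the receiver-based stream in JavaKafkaWordCount identifies its consumer group (the ZooKeeper address, group id and topic name below are placeholders; in the example they come from the command-line arguments):

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.spark.streaming.api.java.JavaPairReceiverInputDStream;
    import org.apache.spark.streaming.kafka.KafkaUtils;

    // map of topic name to number of receiver threads for that topic
    Map<String, Integer> topicMap = new HashMap<String, Integer>();
    topicMap.put("log-topic", 1);

    // the third argument is the consumer group id; reusing the same id across
    // restarts lets the consumer resume from its last committed offset instead
    // of re-consuming the topic under a fresh group
    JavaPairReceiverInputDStream<String, String> messages =
        KafkaUtils.createStream(jssc, "localhost:2181", "wordcount-group", topicMap);

If you need to set other Kafka consumer properties, there is also a createStream overload that takes a full kafkaParams map.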
> spark 1
>
> which should be spark 3.
>
> So how do I handle this in the Spark code?
>
> Thanks and regards
> Shweta Jadhav
>
>
>
> -----Sean Owen wrote: -----
> To: Jadhav Shweta
> From: Sean Owen
> Date: 02/02/2015 04:13PM
> Subject: Re: Java Kafka Word Count Issue
When I append spark to my log.text file, the Spark program gives output as
spark 1
which should be spark 3.
So how do I handle this in the Spark code?
Thanks and regards
Shweta Jadhav
-----Sean Owen wrote: -----
To: Jadhav Shweta
From: Sean Owen
Date: 02/02/2015 04:13PM
Subject: Re: Java Kafka Word Count Issue
Hi All,
I am trying to run the Kafka Word Count program.
Please find the link for the same below:
https://github.com/apache/spark/blob/master/examples/scala-2.10/src/main/java/org/apache/spark/examples/streaming/JavaKafkaWordCount.java
I have set the Spark master to setMaster("local[*]")
and I have sta