Hi
changed code like this but error continues:
myUnionRdd.repartition(sparkNumberOfSlaves).foreachRDD(
    new Function<JavaPairRDD, Void>() {
        private static final long serialVersionUID = 1L;

        @Override
        public Void call(JavaPairRDD v1) throws Exception {
myFunction() is probably capturing unexpected things in the closure of the
Function you have defined, because myFunction is defined outside it. Try
defining myFunction inside the Function and see if the problem persists.
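To illustrate the capture problem outside Spark: an anonymous inner class declared in an instance method holds a hidden reference to the enclosing object, so calling an outer method like myFunction from inside it drags the whole (non-serializable) enclosing class into the closure. A minimal sketch (the class names here are hypothetical, and a plain Serializable interface stands in for Spark's Function):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class ClosureCapture {
    // Stand-in for org.apache.spark.api.java.function.Function,
    // which also extends Serializable.
    interface Function<T, R> extends Serializable {
        R call(T v);
    }

    // Stand-in for the driver class: NOT Serializable.
    static class Outer {
        int helper(int x) { return x + 1; }

        Function<Integer, Integer> badFn() {
            // Calls helper(), so the anonymous class captures the
            // enclosing Outer instance -> NotSerializableException.
            return new Function<Integer, Integer>() {
                public Integer call(Integer v) { return helper(v); }
            };
        }

        // Declared in a static context: no enclosing instance is
        // captured, so the closure serializes cleanly.
        static Function<Integer, Integer> goodFn() {
            return new Function<Integer, Integer>() {
                public Integer call(Integer v) { return v + 1; }
            };
        }
    }

    static boolean serializes(Object o) {
        try (ObjectOutputStream oos =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            oos.writeObject(o);
            return true;
        } catch (NotSerializableException e) {
            return false;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        Outer o = new Outer();
        System.out.println("badFn serializes:  " + serializes(o.badFn()));
        System.out.println("goodFn serializes: " + serializes(Outer.goodFn()));
    }
}
```

The same applies in Spark: keep the logic inside the Function (or in a static helper / a standalone Serializable class) rather than calling an instance method of the driver class.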
On Thu, Jun 9, 2016 at 3:57 AM, sandesh deshmane
wrote:
Hi,
I am using Spark Streaming to stream data from Kafka 0.8,
with checkpointing in HDFS. I am getting an error like the one below:
java.io.NotSerializableException: DStream checkpointing has been enabled
but the DStreams with their functions are not serialisable
field (class: