Thank you Tathagata and Terry for your responses. You were absolutely
correct: I created a dummy DStream (to prevent the Flume channel from
filling up) and counted the messages, but I didn't output (print) them,
which is why it reported that error. Since I called print(), the error is
no longer being reported.
Hello all,
My Streaming job is throwing the below exception at every interval. It
first deletes the checkpoint file and then tries to checkpoint again; is
this normal behaviour? I'm using Spark 1.3.0. Do you know what may cause
this issue?
15/09/24 16:35:55 INFO scheduler.TaskSetManager:
Are you by any chance setting DStream.remember() with null?
On Thu, Sep 24, 2015 at 5:02 PM, Uthayan Suthakar <
uthayan.sutha...@gmail.com> wrote:
> Hello all,
>
> My Streaming job is throwing the below exception at every interval. It
> first deletes the checkpoint file and then it's trying to
I ran into this before: in my program, some DStreams were not initialized
because they were not in the path of any output operation.
You can check whether your case is the same.
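To illustrate: a DStream that is only transformed but never consumed by an output operation is never registered with the streaming scheduler, which can surface as the checkpoint error above. Below is a minimal sketch (the app name, checkpoint path, and socket source are illustrative placeholders, not from this thread):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object CheckpointSketch {
  def main(args: Array[String]): Unit = {
    // Illustrative local setup; adjust master/app name for your cluster.
    val conf = new SparkConf().setAppName("checkpoint-sketch").setMaster("local[2]")
    val ssc  = new StreamingContext(conf, Seconds(10))
    ssc.checkpoint("/tmp/checkpoint") // placeholder path

    // Placeholder source; any receiver-based DStream behaves the same.
    val lines  = ssc.socketTextStream("localhost", 9999)
    val counts = lines.count()

    // Without an output operation, `counts` is never initialized.
    // Registering one (print, foreachRDD, saveAsTextFiles, ...) fixes it:
    counts.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

The same applies when you only want to drain a channel: even a "dummy" DStream needs at least one output operation such as print() for the job graph to be valid.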
Thanks!
- Terry
On Fri, Sep 25, 2015 at 10:22 AM, Tathagata Das wrote:
> Are you by any chance setting