Hi All,
I'm running Spark Streaming (Python) with Direct Kafka and I'm seeing that
the memory usage will slowly go up and eventually kill the job in a few
days.
Everything runs fine at first, but after a few days the job starts issuing
error: [Errno 104] Connection reset by peer, followed by
    lambda part: save_sets(part, KEY_SET_NAME,
Where do you save the part to?
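For context, here is a minimal sketch of what a per-partition save helper like save_sets often looks like, and the resource-leak pattern that can make a long-running streaming job's memory creep up batch after batch. Everything here is an assumption for illustration: FakeStoreClient stands in for whatever client save_sets actually uses (e.g. a Redis connection), and the names are hypothetical, not from this thread.

```python
# Hypothetical sketch of a per-partition save helper.  FakeStoreClient
# is a stand-in for a real external-store client; it only tallies how
# many connections are currently open so the leak pattern is visible.

class FakeStoreClient:
    open_count = 0  # class-level count of live connections

    def __init__(self):
        FakeStoreClient.open_count += 1
        self.data = {}

    def sadd(self, key, value):
        # Mimics adding a member to a named set in the store.
        self.data.setdefault(key, set()).add(value)

    def close(self):
        FakeStoreClient.open_count -= 1


def save_sets(partition_iter, key_set_name):
    # Open ONE client per partition (not per record), and always close
    # it in a finally block.  Forgetting the close is a common way for
    # a streaming job to leak connections/memory over days.
    client = FakeStoreClient()
    try:
        for record in partition_iter:
            client.sadd(key_set_name, record)
    finally:
        client.close()


# In the streaming job this would typically be wired up roughly as:
#   stream.foreachRDD(lambda rdd: rdd.foreachPartition(
#       lambda part: save_sets(part, KEY_SET_NAME)))
if __name__ == "__main__":
    save_sets(iter(["a", "b", "a"]), "demo_set")
    print(FakeStoreClient.open_count)  # → 0, no leaked connections
```

If the real save_sets opens a client but never closes it (or caches state across batches), each micro-batch leaves a little more behind, which would match the slow growth described above.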
For the OutOfMemoryError, the last line of the stack trace was from Utility.scala.
Is there anything before that?
Thanks
On Thu, Dec 3, 2015 at 11:47 AM, Augustus Hong wrote: