Did you try calling rdd.unpersist()? Setting the reference to null only makes
the RDD eligible for garbage collection; unpersist() releases the cached
blocks explicitly.
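Also worth noting: the path in the error (.../work/app-.../0/stderr) is the
application work directory on the standalone worker, which holds executor
stdout/stderr logs and shuffle spill, not the RDD cache. A sketch of the
standard standalone-worker cleanup and log-rolling properties (values here
are illustrative, not recommendations; set them in conf/spark-env.sh on each
worker):

```sh
# Enable periodic cleanup of finished applications' work dirs.
# Note: this only removes dirs of *stopped* applications, so a
# long-running streaming job also needs executor log rolling.
export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true \
  -Dspark.worker.cleanup.interval=1800 \
  -Dspark.worker.cleanup.appDataTtl=86400"
```

For the streaming case, the spark.executor.logs.rolling.* properties
(strategy, maxSize, maxRetainedFiles) cap how large the stderr/stdout files
can grow.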

On Wed, Oct 18, 2017 at 10:04 AM, Mina Aslani <aslanim...@gmail.com> wrote:

> Hi,
>
> I get "No space left on device" error in my spark worker:
>
> Error writing stream to file /usr/spark-2.2.0/work/app-......./0/stderr
> java.io.IOException: No space left on device
>
> In my spark cluster, I have one worker and one master.
> My program consumes a stream of data from Kafka and publishes the result
> back into Kafka. I set my RDD = null after I finish working with it, so
> that intermediate shuffle files are removed quickly.
>
> How can I avoid "No space left on device"?
>
> Best regards,
> Mina
>



-- 
I.R
