Hi,
I'm trying to figure out how to write a large amount of data from each worker. I tried
rdd.saveAsTextFile, but got an OOM when generating a 1024 MB string on a worker.
Increasing worker memory would mean I'd have to drop the number of workers.
So, any idea how to write, e.g., a 1 GB file from each worker? A rough sketch of
what I'm doing now is below.
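Roughly what I'm doing, simplified down (generateRecord, the record count, and the
output path are just placeholders standing in for my real job, but the shape is the
same: each partition builds one big string before saveAsTextFile writes it, which is
where the OOM hits):

import org.apache.spark.{SparkConf, SparkContext}

object BigWritePerWorker {
  // placeholder generator, stands in for my real per-record data
  def generateRecord(i: Long): String = s"record-$i," + "x" * 100

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("big-write"))

    // enough records to reach roughly 1 GB of text per partition
    val recordsPerPartition = 10L * 1000 * 1000

    val rdd = sc.parallelize(0 until sc.defaultParallelism, sc.defaultParallelism)
      .mapPartitions { _ =>
        // the whole partition's output is accumulated as ONE string in memory,
        // so the executor needs ~1 GB of heap just for this buffer -> OOM
        val sb = new StringBuilder
        var i = 0L
        while (i < recordsPerPartition) {
          sb.append(generateRecord(i)).append('\n')
          i += 1
        }
        Iterator(sb.toString)
      }

    rdd.saveAsTextFile("hdfs:///tmp/big-output")  // output path is a placeholder
    sc.stop()
  }
}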

cheers,
-jan