[...]s onto the driver. I could increase the driver memory, but this wouldn't
help if saveAsParquet then decided to pull in the output of 100 tasks at a time.
Is there a way to avoid this OOM error?
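
(Sketch of one possible mitigation, in Scala, with made-up data and paths; it
is not from this thread. It assumes the 1.4+ DataFrame writer, the
older-version equivalent being saveAsParquetFile, and it assumes the driver
growth comes from the per-file footers gathered for the _metadata summary
files, which may not be what is actually happening here.)

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object ParquetWriteSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("parquet-write-sketch"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // Hypothetical data standing in for the real job's DataFrame.
    val df = sc.parallelize(1 to 1000000).map(i => (i, s"row-$i")).toDF("id", "value")

    // Skip the driver-side _metadata/_common_metadata summary files, which are
    // assembled on the driver from every task's Parquet footer (assumption:
    // this is what is eating driver memory here).
    sc.hadoopConfiguration.set("parquet.enable.summary-metadata", "false")

    // Fewer output partitions means fewer concurrent writers and fewer footers.
    df.coalesce(16).write.parquet("/tmp/parquet-write-sketch")

    sc.stop()
  }
}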
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/out-of-memory-error-with-Parquet-tp25381p25382.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> From the Spark jiras, it seems that
> Parquet is known to have some memory issues with buffering and writing, and
> that at least some were resolved in Spark 1.5.0.
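
(On the buffering side, one concrete knob worth knowing about: each open
Parquet writer buffers roughly one row group in memory before flushing, so
executor memory use scales with parquet.block.size times the number of files a
task has open at once. Below is a small, self-contained Scala sketch of turning
that down; the sizes and paths are only illustrative, and whether this relates
to the specific issue fixed in 1.5.0 is an assumption.)

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SmallerRowGroups {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("smaller-row-groups"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // parquet.block.size: row-group size in bytes (default ~128 MB).
    // parquet.page.size: page size within a row group (default ~1 MB).
    // Smaller values mean each writer buffers less before flushing to disk.
    sc.hadoopConfiguration.setInt("parquet.block.size", 32 * 1024 * 1024)
    sc.hadoopConfiguration.setInt("parquet.page.size", 512 * 1024)

    // Made-up data; the real job would already have its own DataFrame.
    val df = sc.parallelize(1 to 1000000).map(i => (i % 100, i)).toDF("bucket", "id")
    df.write.parquet("/tmp/smaller-row-groups")

    sc.stop()
  }
}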