Hi Sim,

Spark 1.4.0's PermGen memory consumption is higher than Spark 1.3's
(explained in https://issues.apache.org/jira/browse/SPARK-8776). Can you
add --conf "spark.driver.extraJavaOptions=-XX:MaxPermSize=256m" to the
command you use to launch the Spark shell? This will increase the PermGen
size from 128m (our default) to 256m.
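
For example, the invocation would look roughly like this (just a sketch; keep
whatever other options you already pass on your command line):

    ./bin/spark-shell --conf "spark.driver.extraJavaOptions=-XX:MaxPermSize=256m"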

Thanks,

Yin

On Thu, Jul 2, 2015 at 12:40 PM, sim <s...@swoop.com> wrote:

> A very simple Spark SQL COUNT operation succeeds in spark-shell for 1.3.1
> and
> fails with a series of out-of-memory errors in 1.4.0.
>
> This gist <https://gist.github.com/ssimeonov/a49b75dc086c3ac6f3c4>
> includes the code and the full output from the 1.3.1 and 1.4.0 runs,
> including the command line showing how spark-shell is started.
>
> Should the 1.4.0 spark-shell be started with different options to avoid
> this
> problem?
>
> Thanks,
> Sim
