That worked. Thanks!

I wonder what changed in 1.4 to cause this. It wouldn't work with anything
less than 256m, even for a simple piece of code, while 1.3.1 worked with
the default (64m, I think).
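
For anyone who hits this later: per the Spark configuration docs, the
same option can also be set persistently in conf/spark-defaults.conf
via spark.driver.extraJavaOptions (the property behind
--driver-java-options). A sketch, which I haven't tested myself:

  spark.driver.extraJavaOptions  -XX:MaxPermSize=256m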

Srikanth

On Wed, Jun 24, 2015 at 12:47 PM, Roberto Coluccio <roberto.coluc...@gmail.com> wrote:

> Did you try to pass it with
>
> --driver-java-options -XX:MaxPermSize=256m
>
> as a spark-shell input argument?
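>
> For example, the full invocation would look something like this (a
> sketch based on the setup quoted below; adjust master and cores to
> your own cluster):
>
>   bin/spark-shell --master spark://machu:7077 \
>     --total-executor-cores 12 \
>     --packages com.databricks:spark-csv_2.10:1.0.3 \
>     --driver-java-options -XX:MaxPermSize=256m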
>
>
> Roberto
>
>
> On Wed, Jun 24, 2015 at 5:57 PM, stati <srikanth...@gmail.com> wrote:
>
>> Hello,
>>
>> I moved from 1.3.1 to 1.4.0 and started receiving
>> "java.lang.OutOfMemoryError: PermGen space" when I use spark-shell.
>> The same Scala code works fine in the 1.3.1 spark-shell, with the same
>> set of external JARs loaded and the same imports.
>>
>> I tried increasing the perm size to 256m, but still got the OOM:
>>
>> SPARK_REPL_OPTS="-XX:MaxPermSize=256m" bin/spark-shell \
>>   --master spark://machu:7077 --total-executor-cores 12 \
>>   --packages com.databricks:spark-csv_2.10:1.0.3 \
>>   --packages joda-time:joda-time:2.8.1
>>
>> The Spark UI "Environment" tab didn't show "-XX:MaxPermSize", so I'm
>> not sure whether the setting was picked up at all.
>> This is standalone mode.
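>>
>> One way to verify would be to print the driver JVM's startup flags
>> from the scala> prompt (this uses only the standard Java management
>> API, so it should work in any spark-shell):
>>
>>   import java.lang.management.ManagementFactory
>>   ManagementFactory.getRuntimeMXBean.getInputArguments
>>
>> If -XX:MaxPermSize=256m doesn't appear in that list, the option never
>> reached the driver JVM.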
>>
>> Any pointers on what to try next?
>>
