Hello,

I moved from 1.3.1 to 1.4.0 and started receiving
"java.lang.OutOfMemoryError: PermGen space" when I use spark-shell.
The same Scala code works fine in the 1.3.1 spark-shell, where I was loading
the same set of external JARs and using the same imports.

I tried increasing the perm size to 256m, but I still got the OOM.

SPARK_REPL_OPTS="-XX:MaxPermSize=256m" bin/spark-shell \
  --master spark://machu:7077 --total-executor-cores 12 \
  --packages com.databricks:spark-csv_2.10:1.0.3,joda-time:joda-time:2.8.1

(Note: --packages takes a single comma-separated list; passing it twice means
only one of the two coordinates takes effect.)

The Spark UI "Environment" tab didn't show "-XX:MaxPermSize", so I'm not sure
whether this setting was picked up.
This is standalone mode.
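One thing worth trying: in 1.4 spark-shell is launched through spark-submit, and the SPARK_REPL_OPTS environment variable may no longer be read. A sketch of an alternative, passing the JVM option to the driver with the documented --driver-java-options flag (master URL, core count, and packages taken from the command above; the equivalent conf property is spark.driver.extraJavaOptions):

```shell
# Pass the PermGen setting directly to the driver JVM instead of relying on
# SPARK_REPL_OPTS, which may be ignored by the 1.4 launcher.
bin/spark-shell \
  --master spark://machu:7077 \
  --total-executor-cores 12 \
  --driver-java-options "-XX:MaxPermSize=256m" \
  --packages com.databricks:spark-csv_2.10:1.0.3,joda-time:joda-time:2.8.1

# Equivalent via a configuration property:
#   bin/spark-shell --conf "spark.driver.extraJavaOptions=-XX:MaxPermSize=256m" ...
```

If the option is being applied, it should then appear under spark.driver.extraJavaOptions in the UI's "Environment" tab.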

Any pointers on what to try next?

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/java-lang-OutOfMemoryError-PermGen-space-tp23472.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
