The error doesn't say you're out of heap memory; it says you're out of PermGen.
If you see this, you aren't actually running on Java 8 AFAIK, because Java 8
has no PermGen. But if you're running Java 7 and you go investigate what this
error means, you'll find you need to increase the PermGen size. The Spark docs
mention this too: you need to increase it when building on Java 7.
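If you are indeed stuck on a Java 7 JVM, the usual fix is to raise the PermGen limit with the HotSpot `-XX:MaxPermSize` flag, passed through Spark's extra-JVM-options settings. A rough sketch (the 512m value is just an example, and `your_script.py` is a placeholder for your PySpark job):

```shell
# Java 7's default PermGen is small, so raise it explicitly.
# In YARN cluster mode, pass the flag to both the driver and the executors:
spark-submit \
  --conf "spark.driver.extraJavaOptions=-XX:MaxPermSize=512m" \
  --conf "spark.executor.extraJavaOptions=-XX:MaxPermSize=512m" \
  your_script.py
```

Note that on Java 8 the same flag is simply ignored with a warning, since PermGen was replaced by Metaspace.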

On Thu, Oct 13, 2016 at 10:00 AM Shady Xu <shad...@gmail.com> wrote:

> Hi,
>
> I have a problem when running Spark SQL by PySpark on Java 8. Below is the
> log.
>
>
> 16/10/13 16:46:40 INFO spark.SparkContext: Starting job: sql at 
> NativeMethodAccessorImpl.java:-2
> Exception in thread "dag-scheduler-event-loop" java.lang.OutOfMemoryError: 
> PermGen space
>       at java.lang.ClassLoader.defineClass1(Native Method)
>       at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
>       at 
> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>       at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
>       at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>       at 
> org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:857)
>       at 
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1630)
>       at 
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1622)
>       at 
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1611)
>       at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
> Exception in thread "shuffle-server-2" java.lang.OutOfMemoryError: PermGen 
> space
> Exception in thread "shuffle-server-4" java.lang.OutOfMemoryError: PermGen 
> space
> Exception in thread "threadDeathWatcher-2-1" java.lang.OutOfMemoryError: 
> PermGen space
>
>
> I tried increasing the driver memory, but it didn't help. However, things are 
> OK when I run the same code after switching to Java 7. I also find that the 
> SparkPi example runs fine on Java 8. So I believe the problem lies with 
> PySpark rather than the Spark core.
>
>
> I am using Spark 2.0.1 and run the program in YARN cluster mode. Any ideas 
> are appreciated.
>
>
