Re: OOM when running Spark SQL by PySpark on Java 8

2016-10-13 Thread Shady Xu
You should set MaxPermSize if anything, not PermSize. However, the error indicates you are not using Java 8 everywhere on your cluster, and that's a potentially bigger problem.

On Thu, Oct 13, 2016 at 10:26 AM Shady Xu <shad...@gmail.com> wrote:
>> Solved the problem b
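Background for the advice above: on Java 7 and older, -XX:PermSize only sets the initial size of the permanent generation, while -XX:MaxPermSize sets its ceiling; Java 8 removed PermGen entirely in favor of Metaspace, so a "PermGen space" OOM can only come from a pre-8 JVM. A minimal sketch (not from this thread) of checking a `java -version` banner for that condition:

```python
import re

def permgen_relevant(java_version_line: str) -> bool:
    """Return True if the JVM described by a `java -version` banner line
    still has a permanent generation (i.e. Java 7 or older).

    Java 8 removed PermGen, so -XX:MaxPermSize is ignored there and a
    "PermGen space" OutOfMemoryError cannot come from a Java 8 JVM.
    """
    m = re.search(r'version "(\d+)\.(\d+)', java_version_line)
    if not m:
        raise ValueError("unrecognized java -version output")
    major, minor = int(m.group(1)), int(m.group(2))
    # Pre-9 JVMs report as 1.x (e.g. 1.7, 1.8); Java 9+ report 9, 10, 11...
    if major == 1:
        return minor <= 7
    return False

# A "PermGen space" error therefore means at least one node runs
# a JVM like the first one below:
print(permgen_relevant('java version "1.7.0_80"'))   # True  -> has PermGen
print(permgen_relevant('java version "1.8.0_102"'))  # False -> Metaspace
```

Running this check against the banner reported by each node (for example over ssh) is one way to confirm the "not Java 8 everywhere" hypothesis.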

Re: OOM when running Spark SQL by PySpark on Java 8

2016-10-13 Thread Shady Xu
to specify them when submitting the Spark job, which is weird. I don't know whether it has anything to do with py4j as I am not familiar with it.

2016-10-13 17:00 GMT+08:00 Shady Xu <shad...@gmail.com>:
> Hi,
>
> I have a problem when running Spark SQL by PySpark on Java 8. Below is

OOM when running Spark SQL by PySpark on Java 8

2016-10-13 Thread Shady Xu
Hi, I have a problem when running Spark SQL by PySpark on Java 8. Below is the log.

16/10/13 16:46:40 INFO spark.SparkContext: Starting job: sql at NativeMethodAccessorImpl.java:-2
Exception in thread "dag-scheduler-event-loop" java.lang.OutOfMemoryError: PermGen space at
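For readers hitting the same error on a genuinely Java 7 cluster: PermGen sizing is passed to Spark JVMs through the extra-Java-options settings rather than on the `java` command line. A hedged spark-defaults.conf sketch, with illustrative values not taken from this thread:

```
# spark-defaults.conf sketch -- values are illustrative.
# These flags only matter on Java 7 (or older) JVMs; Java 8 ignores
# PermGen sizing and uses -XX:MaxMetaspaceSize for the analogous limit.
spark.driver.extraJavaOptions    -XX:MaxPermSize=512m
spark.executor.extraJavaOptions  -XX:MaxPermSize=512m
```

If the error persists with these set, that supports the reply above: some JVMs in the cluster are not the version you expect.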

Re: How to display the web ui when running Spark on YARN?

2016-03-09 Thread Shady Xu
, but then it reported /bin/yarn not found. It seems the installation and configuration of the CDH distribution differ somewhat from the Apache one.

2016-03-04 18:27 GMT+08:00 Steve Loughran <ste...@hortonworks.com>:
>
> On 3 Mar 2016, at 09:17, Shady Xu <shad...@gmail.com> wrote:

How to display the web ui when running Spark on YARN?

2016-03-03 Thread Shady Xu
Hi all, I am running Spark in yarn-client mode, but every time I access the web UI, the browser redirects me to one of the worker nodes and shows nothing. The URL looks like http://hadoop-node31.company.com:8088/proxy/application_1453797301246_120264 . I googled a lot and found some possible
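Context on why the redirect happens: in yarn-client mode the Spark web UI is served by the driver process itself (port 4040 by default), and the ResourceManager page at :8088/proxy/... only proxies requests through to it. A blank proxy page therefore usually means the proxy cannot reach the driver's UI port. A hedged sketch of the relevant settings, with illustrative values:

```
# spark-defaults.conf sketch -- illustrative values, not a confirmed fix:
spark.ui.enabled  true
spark.ui.port     4040   # port the driver serves the UI on; must be
                         # reachable from the ResourceManager host
```

Browsing the driver host directly at that port is a quick way to tell whether the UI itself is up and only the proxying is broken.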