Could you show how you set the configurations? You need to set these
configurations before creating the SparkContext and SQLContext.
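For example, a minimal sketch (app name and master are placeholders; the key point is that the retention settings go on the SparkConf before either context is constructed):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Set the UI retention limits on the SparkConf *before* creating the
// contexts; setting them after the contexts exist has no effect.
val conf = new SparkConf()
  .setAppName("ui-memory-demo")   // placeholder app name
  .setMaster("local[*]")          // placeholder master, for local testing
  .set("spark.ui.retainedJobs", "20")
  .set("spark.ui.retainedStages", "40")
  .set("spark.sql.ui.retainedExecutions", "0")

val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)
```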

Moreover, the history server doesn't support the SQL UI, so
"spark.eventLog.enabled=true" doesn't help here for now.

Best Regards,
Shixiong Zhu

2015-10-13 2:01 GMT+08:00 pnpritchard <nicholas.pritch...@falkonry.com>:

> Hi,
>
> In my application, the Spark UI is consuming a lot of memory, especially
> the SQL tab. I have set the following configurations to reduce the memory
> consumption:
> - spark.ui.retainedJobs=20
> - spark.ui.retainedStages=40
> - spark.sql.ui.retainedExecutions=0
>
> However, I still get OOM errors in the driver process with the default 1GB
> heap size. The following link is a screen shot of a heap dump report,
> showing the SQLListener instance having a retained size of 600MB.
>
> https://cloud.githubusercontent.com/assets/5124612/10404379/20fbdcfc-6e87-11e5-9415-27e25193a25c.png
>
> Rather than just increasing the allotted heap size, does anyone have any
> other ideas? Is it possible to disable the SQL tab specifically? I also
> thought about serving the UI from disk rather than memory with
> "spark.eventLog.enabled=true" and "spark.ui.enabled=false". Has anyone
> tried this before?
>
> Thanks,
> Nick
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-UI-consuming-lots-of-memory-tp25033.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>
