1. Could you share your Spark version?
2. Could you reduce "spark.sql.ui.retainedExecutions" and see whether that
helps? This configuration has been available since 2.3.0, and its default
value is 1000. (A minimal example of setting it is sketched below.)
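
For example, a minimal sketch assuming you create your own SparkSession (the
app name below is hypothetical, and 100 is only an illustrative value). Note
that this is a static SQL config, so it has to be set before the
SparkContext/UI starts, e.g. at session creation, in spark-defaults.conf, or
via spark-submit --conf:

    import org.apache.spark.sql.SparkSession

    // Keep fewer completed SQL executions in the UI store so the driver
    // holds on to less UI metadata. Default is 1000; 100 is just an example.
    val spark = SparkSession.builder()
      .appName("lower-retained-executions")
      .config("spark.sql.ui.retainedExecutions", "100")
      .getOrCreate()

The command-line equivalent would be
--conf spark.sql.ui.retainedExecutions=100 when submitting the job.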

Thanks,
Jungtaek Lim (HeartSaVioR)

On Tue, May 22, 2018 at 4:29 PM, weand <andreas.we...@gmail.com> wrote:

> You can see it even better on this screenshot:
>
> TOP Entries Collapsed #2
> <http://apache-spark-user-list.1001560.n3.nabble.com/file/t8542/27_001.png>
>
>
> Sorry for the spam; I attached a less-than-ideal screenshot in the previous mail.
>
>
>
