I have the same problem after upgrading my application from Spark 2.2.1 to 
Spark 2.3.2 and running in YARN client mode.
I also noticed that in my Spark driver, org.apache.spark.status.TaskDataWrapper
could take up more than 2 GB of memory. 
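
One way to check where the driver heap is going is a class histogram of the
driver JVM, for example (with <driver-pid> standing in for the actual driver
process id):

    # count live TaskDataWrapper instances in the driver heap
    jmap -histo:live <driver-pid> | grep TaskDataWrapper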

Shing


    On Tuesday, 16 October 2018, 17:34:02 GMT+1, Patrick Brown 
<patrick.barry.br...@gmail.com> wrote:  
 
 I recently upgraded to Spark 2.3.1. I have had these same settings in my 
spark-submit script, which worked on 2.0.2 and which, according to the 
documentation, have not changed:
spark.ui.retainedTasks=1
spark.ui.retainedStages=1
spark.ui.retainedJobs=1
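
For illustration, such settings are typically passed to spark-submit along
these lines (a sketch only; the class and jar names below are placeholders):

    # placeholders: com.example.MyApp / my-app.jar stand in for the real job
    spark-submit \
      --class com.example.MyApp \
      --conf spark.ui.retainedTasks=1 \
      --conf spark.ui.retainedStages=1 \
      --conf spark.ui.retainedJobs=1 \
      my-app.jar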
However, in 2.3.1 the UI doesn't seem to respect this; it still retains a huge 
number of jobs:



Is this a known issue? Any ideas?  
