Hi,

No, currently you can't change executor settings on the fly; they are fixed for the lifetime of an application.
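
As a rough sketch of the two-job workaround you mention (class and path names below are hypothetical), the heavy phase can run as its own application with its own executor settings:

  import org.apache.spark.{SparkConf, SparkContext}

  // Job 1: read the skewed files with one core per executor, so the
  // single running task gets the whole executor heap to itself.
  object ProcessSkewedFiles {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf()
        .setAppName("process-skewed-files")
        .set("spark.executor.cores", "1")
      val sc = new SparkContext(conf)
      // ... read via your custom Hadoop RecordReader, then persist an
      // intermediate result for the second, fully parallel job ...
      sc.stop()
    }
  }

A second application, submitted with spark.executor.cores set back to the full count, can then pick up the intermediate output with all cores enabled. Note that spark.task.cpus is another knob for limiting concurrent tasks per executor, but it too is fixed at submit time, so it can't be flipped mid-application either.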

// maropu



On 2016/08/27 at 11:40, Vadim Semenov <vadim.seme...@datadoghq.com> wrote:

> Hi spark users,
> 
> I wonder if it's possible to change executor settings on the fly.
> I have the following use case: I have a lot of non-splittable, skewed files in 
> a custom format that I read using a custom Hadoop RecordReader. These files 
> range from small to huge, and I'd like to use only one or two cores per executor 
> while they are being processed (so each task can use the whole heap). But once 
> they have been processed, I'd like to enable all cores.
> I know that I can achieve this by splitting the work into two separate jobs, but I 
> wonder if it's possible to somehow achieve the behavior I described within one job.
> 
> Thanks!
