On YARN it is impossible, as far as I know. On Kubernetes you can use
taints to keep Spark's driver and executor pods off certain nodes.
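
A minimal sketch of the taint approach (the node name "node-1" and the
taint key/value are placeholders, not anything Spark-specific):

    # Taint the node: pods without a matching toleration -- which
    # includes Spark driver and executor pods by default -- will no
    # longer be scheduled onto it.
    kubectl taint nodes node-1 dedicated=non-spark:NoSchedule

    # To undo later, remove the taint with the trailing "-":
    kubectl taint nodes node-1 dedicated=non-spark:NoSchedule-

Note this only affects scheduling of new pods; it does not evict pods
already running on the node (NoExecute would, but that is more drastic).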

On Fri, Jan 18, 2019 at 9:35 PM Felix Cheung <felixcheun...@hotmail.com>
wrote:

> Not as far as I recall...
>
>
> ------------------------------
> *From:* Serega Sheypak <serega.shey...@gmail.com>
> *Sent:* Friday, January 18, 2019 3:21 PM
> *To:* user
> *Subject:* Spark on Yarn, is it possible to manually blacklist nodes
> before running spark job?
>
> Hi, is there any way to tell the scheduler to blacklist specific nodes
> in advance?
>
