Please refer to:
https://spark.apache.org/docs/latest/running-on-yarn.html

You can set spark.yarn.am.nodeLabelExpression and
spark.yarn.executor.nodeLabelExpression to a YARN node label that the
cluster admin has assigned to those 2 machines.
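
As a rough sketch (the label name "sparkonly" below is just a placeholder,
and it assumes the label has already been created in YARN and mapped to the
2 target nodes, e.g. via yarn rmadmin -addToClusterNodeLabels and
-replaceLabelsOnNode):

    import org.apache.spark.{SparkConf, SparkContext}

    // "sparkonly" is a hypothetical node label that must already exist in
    // YARN and be assigned to the 2 machines you want to use.
    val conf = new SparkConf()
      .setAppName("RestrictedApp")
      // Place the YARN ApplicationMaster only on labeled nodes.
      .set("spark.yarn.am.nodeLabelExpression", "sparkonly")
      // Place all executors only on labeled nodes.
      .set("spark.yarn.executor.nodeLabelExpression", "sparkonly")

    val sc = new SparkContext(conf)

The same properties can also be passed with --conf on spark-submit; for
yarn-cluster mode that is the safer route, since the AM is placed before
your application's own SparkConf is read. Note that node label expressions
require YARN 2.6 or later.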

On Wed, May 4, 2016 at 3:03 AM, Shams ul Haque <sham...@cashcare.in> wrote:

> Hi,
>
> I have a cluster of 4 machines for Spark. I want my Spark app to run on 2
> machines only, and the remaining 2 machines to be used for other Spark apps.
> So my question is, can I restrict my app to run on those 2 machines only by
> passing some IP at the time of setting SparkConf, or by any other setting?
>
>
> Thanks,
> Shams
>
