Hi Anton,

Spark pools and the Spark Fair Scheduler schedule the jobs running within a
single Spark application. Each Spark job has multiple stages, and each stage
has multiple tasks.
This is different from the YARN Fair Scheduler, which schedules the
applications submitted to the YARN cluster. Spark pools within a Spark
application work on a YARN cluster as well.
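
For example, a minimal sketch in Scala (the pool name "poolA" and the
allocation-file path are illustrative, not defaults):

    import org.apache.spark.sql.SparkSession

    // Enable fair scheduling between jobs inside this application
    // (the default mode is FIFO).
    val spark = SparkSession.builder()
      .appName("fair-pools-demo")
      .config("spark.scheduler.mode", "FAIR")
      // Optional: define named pools (schedulingMode, weight, minShare)
      // in an XML file using the format from the docs page linked below.
      .config("spark.scheduler.allocation.file", "/path/to/fairscheduler.xml")
      .getOrCreate()

    // Jobs submitted from this thread run in "poolA"; other threads can
    // set a different pool to share the application's resources fairly.
    spark.sparkContext.setLocalProperty("spark.scheduler.pool", "poolA")
    spark.range(0L, 1000000L).count()  // this job is scheduled in poolA

    // Unset so later jobs from this thread go back to the default pool.
    spark.sparkContext.setLocalProperty("spark.scheduler.pool", null)

This works the same on YARN, since the pool scheduling happens inside the
driver, independently of the cluster manager.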

Thanks,
Prabhu Joseph

On Tue, Feb 26, 2019 at 11:53 AM Anton Puzanov <antonpuzdeve...@gmail.com>
wrote:

> Hi everyone,
>
> Spark supports in-application job concurrency using pools and Spark's Fair
> Scheduler (different from YARN's Fair Scheduler).
> link:
> https://spark.apache.org/docs/latest/job-scheduling.html#scheduling-within-an-application
>
> Is this feature supported when YARN is used as a cluster manager? Are
> there special configurations I have to set or common pitfalls I need to be
> aware of?
>
> Thanks,
> Anton
>
>
