Re: Spark scheduling mode

2016-09-02 Thread Mark Hamstra
…at comparator is called? >> It looks like the Spark scheduler does not have any form of preemption, am I right? >> >> Thank you >> -- >> *From:* Mark Hamstra <m...@clearstorydata.com> >> *Sent:* Thursday, September 1, 2016 8:44:10 PM

Re: Spark scheduling mode

2016-09-02 Thread Mark Hamstra
…am I right? > > Thank you > -- > *From:* Mark Hamstra <m...@clearstorydata.com> > *Sent:* Thursday, September 1, 2016 8:44:10 PM > *To:* enrico d'urso > *Cc:* user@spark.apache.org > *Subject:* Re: Spark scheduling mode

Re: Spark scheduling mode

2016-09-02 Thread enrico d'urso
…: user@spark.apache.org Subject: Re: Spark scheduling mode Spark's FairSchedulingAlgorithm is not round robin: https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/SchedulingAlgorithm.scala#L43 When at the scope of fair scheduling Jobs within a single Pool, the Sched…
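For readers following that link: the comparator there ranks schedulables by minimum share and weight rather than taking turns. A rough Python paraphrase, purely for illustration (the real implementation is the Scala file linked above; s1 and s2 stand in for Schedulable objects with runningTasks, minShare, weight, and name fields):

# Rough paraphrase of FairSchedulingAlgorithm.comparator, for illustration only.
def fair_comes_first(s1, s2):
    s1_needy = s1.runningTasks < s1.minShare
    s2_needy = s2.runningTasks < s2.minShare
    min_share_ratio1 = s1.runningTasks / max(s1.minShare, 1.0)
    min_share_ratio2 = s2.runningTasks / max(s2.minShare, 1.0)
    task_to_weight1 = s1.runningTasks / float(s1.weight)
    task_to_weight2 = s2.runningTasks / float(s2.weight)

    if s1_needy and not s2_needy:
        return True    # schedulables below their minShare are served first
    if s2_needy and not s1_needy:
        return False
    if s1_needy and s2_needy:
        if min_share_ratio1 != min_share_ratio2:
            return min_share_ratio1 < min_share_ratio2   # least-satisfied minShare first
    elif task_to_weight1 != task_to_weight2:
        return task_to_weight1 < task_to_weight2         # fewest running tasks per unit of weight first
    return s1.name < s2.name                             # deterministic tie-break on name

Nothing in this ordering cycles through jobs in turn; it keeps picking whichever schedulable is furthest below its fair allocation.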

Re: Spark scheduling mode

2016-09-01 Thread Mark Hamstra
…scheduled in a round-robin way, > am I right? > > -- > *From:* Mark Hamstra <m...@clearstorydata.com> > *Sent:* Thursday, September 1, 2016 8:19:44 PM > *To:* enrico d'urso > *Cc:* user@spark.apache.org > *Subject:* Re: Spark scheduling mode > > The default…

Re: Spark scheduling mode

2016-09-01 Thread enrico d'urso
…<m...@clearstorydata.com> Sent: Thursday, September 1, 2016 8:19:44 PM To: enrico d'urso Cc: user@spark.apache.org Subject: Re: Spark scheduling mode The default pool (``) can be configured like any other pool: https://spark.apache.org/docs/latest/job-scheduling.html#configuring-pool-properties On Thu, Sep 1…
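A minimal PySpark sketch of wiring that up (the file path and the pool values in the comment are placeholders, not something taken from this thread):

# Sketch: point Spark at an allocation file so the default pool gets explicit
# settings, just like any named pool. Path and values below are placeholders.
from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .setAppName("fair-scheduling-example")
        .set("spark.scheduler.mode", "FAIR")
        # fairscheduler.xml can declare, e.g.:
        #   <pool name="default">
        #     <schedulingMode>FAIR</schedulingMode>
        #     <weight>1</weight>
        #     <minShare>2</minShare>
        #   </pool>
        .set("spark.scheduler.allocation.file", "/path/to/fairscheduler.xml"))

sc = SparkContext(conf=conf)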

Re: Spark scheduling mode

2016-09-01 Thread Mark Hamstra
…default pool? > I mean, round robin for the jobs that belong to the default pool. > > Cheers, > -- > *From:* Mark Hamstra <m...@clearstorydata.com> > *Sent:* Thursday, September 1, 2016 7:24:54 PM > *To:* enrico d'urso > *Cc:* user@spark.apache.org

Re: Spark scheduling mode

2016-09-01 Thread enrico d'urso
…: user@spark.apache.org Subject: Re: Spark scheduling mode Just because you've flipped spark.scheduler.mode to FAIR, that doesn't mean that Spark can magically configure and start multiple scheduling pools for you, nor can it know to which pools you want jobs assigned. Without doing any…

Re: Spark scheduling mode

2016-09-01 Thread Mark Hamstra
Just because you've flipped spark.scheduler.mode to FAIR, that doesn't mean that Spark can magically configure and start multiple scheduling pools for you, nor can it know to which pools you want jobs assigned. Without doing any setup of additional scheduling pools or assigning of jobs to pools,…
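Concretely, assigning jobs to pools is done per thread through a SparkContext local property. A short sketch (the pool name is invented; a pool not declared in fairscheduler.xml is created on the fly with default settings):

# Sketch: jobs submitted from this thread after the call below land in the
# named pool; threads that never set the property stay in the default pool.
sc.setLocalProperty("spark.scheduler.pool", "production")
# ... trigger actions here; they are scheduled in the "production" pool ...

# Clearing the property sends later jobs from this thread back to the default pool.
sc.setLocalProperty("spark.scheduler.pool", None)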

Spark scheduling mode

2016-09-01 Thread enrico d'urso
I am building a Spark app in which I submit several jobs (PySpark). I am using threads to run them in parallel, and I am also setting: conf.set("spark.scheduler.mode", "FAIR"). Still, I see the jobs run serially, in FIFO order. Am I missing something? Cheers, Enrico
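A condensed sketch of the setup described here, together with the per-thread pool assignment that the replies above point at (pool names and the job body are invented for illustration):

import threading
from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("parallel-jobs").set("spark.scheduler.mode", "FAIR")
sc = SparkContext(conf=conf)

def run_job(pool, n):
    # Tag this thread's jobs with a scheduler pool. With FAIR mode alone and no
    # pool assignment, every job shares the default pool, which queues them FIFO.
    sc.setLocalProperty("spark.scheduler.pool", pool)
    total = sc.parallelize(range(n), 8).map(lambda x: x * x).sum()
    print(pool, total)

threads = [threading.Thread(target=run_job, args=("pool_%d" % i, 1000000))
           for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()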