Re: Spark scheduling mode

2016-09-02 Thread enrico d'urso
Thank you. May I know when that comparator is called? It looks like the Spark scheduler does not have any form of preemption, am I right? Thank you.
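As a rough illustration of what the fair-share comparator does (a paraphrase in Python, not Spark's actual code; the attribute names are invented): the schedulable pools and task sets are re-sorted whenever the scheduler hands out free executor slots, which is also why there is no preemption in the usual sense: the ordering only decides where newly launched tasks go, while tasks that are already running are left alone.

    def fair_share_key(entity):
        # "entity" stands for a pool or task set, with made-up attributes:
        # running_tasks, min_share and weight
        below_min_share = entity.running_tasks < entity.min_share
        min_share_ratio = float(entity.running_tasks) / max(entity.min_share, 1)
        tasks_to_weight = float(entity.running_tasks) / entity.weight
        # entities still below their minShare come first, then the least loaded
        # relative to minShare and weight
        return (not below_min_share, min_share_ratio, tasks_to_weight)

    # queue = sorted(schedulables, key=fair_share_key)
    # the sort only decides which entity gets the *next* free slot;
    # tasks that are already running are never stopped (no preemption)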

Re: Spark scheduling mode

2016-09-01 Thread enrico d'urso
From: Mark Hamstra <m...@clearstorydata.com> Sent: Thursday, September 1, 2016 8:19:44 PM To: enrico d'urso Cc: user@spark.apache.org Subject: Re: Spark scheduling mode
The default pool can be configured like any other pool: https://spark.apache.org/docs/latest/job-scheduling.html#configuring-pool-properties
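A sketch of what the linked section describes: an allocation XML file, pointed to by spark.scheduler.allocation.file, that declares a pool named "default" with FAIR scheduling inside the pool. The file path and the weight/minShare values below are only examples:

    # write an allocation file that redefines the "default" pool to schedule
    # FAIR internally; path and property values are illustrative
    fairscheduler_xml = """<?xml version="1.0"?>
    <allocations>
      <pool name="default">
        <schedulingMode>FAIR</schedulingMode>
        <weight>1</weight>
        <minShare>0</minShare>
      </pool>
    </allocations>
    """
    with open("/tmp/fairscheduler.xml", "w") as f:
        f.write(fairscheduler_xml)

    from pyspark import SparkConf, SparkContext

    conf = (SparkConf()
            .set("spark.scheduler.mode", "FAIR")
            .set("spark.scheduler.allocation.file", "/tmp/fairscheduler.xml"))
    sc = SparkContext(conf=conf)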

Re: Spark scheduling mode

2016-09-01 Thread enrico d'urso
Is there a way to force scheduling to be fair inside the default pool? I mean, round robin for the jobs that belong to the default pool. Cheers,
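One option the job-scheduling docs describe, rather than changing the default pool itself, is to give each submitting thread its own pool with sc.setLocalProperty, so that FAIR mode shares resources across those pools. A sketch with made-up pool names; per-thread local-property propagation in PySpark has its own caveats, so treat this as an outline rather than a drop-in fix:

    import threading
    from pyspark import SparkConf, SparkContext

    sc = SparkContext(conf=SparkConf().set("spark.scheduler.mode", "FAIR"))

    def run_in_own_pool(job_id):
        # local properties apply to the submitting thread, so each thread's
        # jobs land in their own (made-up) pool and FAIR mode shares across pools
        sc.setLocalProperty("spark.scheduler.pool", "pool_%d" % job_id)
        sc.parallelize(range(1000000)).count()

    threads = [threading.Thread(target=run_in_own_pool, args=(i,)) for i in range(3)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()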

Spark scheduling mode

2016-09-01 Thread enrico d'urso
I am building a Spark app in which I submit several jobs (PySpark). I am using threads to run them in parallel, and I am also setting:
conf.set("spark.scheduler.mode", "FAIR")
Still, I see the jobs run serially, in FIFO order. Am I missing something? Cheers, Enrico
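A minimal sketch of the setup described above, assuming a plain PySpark script (the app name, job bodies and thread count are made up). Note that spark.scheduler.mode has to be set on the SparkConf before the SparkContext is created, and that without any pool configuration all of these jobs still end up in the single default pool:

    import threading
    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setAppName("fair-scheduling-sketch")
    conf.set("spark.scheduler.mode", "FAIR")  # must be set before the SparkContext exists
    sc = SparkContext(conf=conf)

    def run_job(job_id):
        # each count() action triggered from its own thread becomes a separate Spark job
        total = sc.parallelize(range(1, 1000000)).map(lambda x: x * job_id).count()
        print("job %d finished, count=%d" % (job_id, total))

    threads = [threading.Thread(target=run_job, args=(i,)) for i in range(1, 4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()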