Re: Spark Fair Scheduler for Spark Streaming - 1.2 and beyond
For a Spark Streaming app, if you want a particular action inside a foreachRDD to go to a particular pool, make sure you set the pool within the foreachRDD function, e.g.:

    dstream.foreachRDD { rdd =>
      rdd.sparkContext.setLocalProperty("spark.scheduler.pool", "pool1") // set the pool
      rdd.count() // or whatever job
    }

This will ensure that the jobs are allocated to the desired pool. LMK if this works.

TD

On Fri, May 15, 2015 at 11:26 AM, Richard Marscher wrote:
> It's not a Spark Streaming app, so sorry, I'm not sure of the answer to
> that. I would assume it should work.
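(For context: the setLocalProperty call above assumes a pool named "pool1" has been defined in a fair scheduler allocation file referenced by spark.scheduler.allocation.file. A minimal sketch — the weight and minShare values here are illustrative, not from this thread:

    <?xml version="1.0"?>
    <!-- fairscheduler.xml: defines the pools that jobs can be assigned to -->
    <allocations>
      <pool name="pool1">
        <schedulingMode>FAIR</schedulingMode> <!-- FAIR scheduling within this pool -->
        <weight>2</weight>                    <!-- relative share vs. other pools -->
        <minShare>1</minShare>                <!-- minimum cores this pool tries to get -->
      </pool>
    </allocations>

A pool name that is not in the file is auto-created with default settings, so the file mainly matters when you want non-default weights or per-pool FAIR mode.)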
Re: Spark Fair Scheduler for Spark Streaming - 1.2 and beyond
If you don't send jobs to different pools, then they will all end up in the default pool. If you leave the intra-pool scheduling policy as the default FIFO, then this will effectively be the same thing as using the default FIFO scheduling. Depending on what you are trying to accomplish, you need some combination of multiple pools and FAIR scheduling within one or more pools. And, of course, you need to actually place each job within an appropriate pool.

On Fri, May 15, 2015 at 11:13 AM, Evo Eftimov wrote:
> No pools for the moment – for each of the apps using the straightforward
> way with the spark conf param for scheduling = FAIR
>
> Spark is running in Standalone Mode
>
> Are you saying that configuring pools is mandatory to get FAIR
> scheduling working – from the docs it seemed optional to me
Re: Spark Fair Scheduler for Spark Streaming - 1.2 and beyond
It's not a Spark Streaming app, so sorry, I'm not sure of the answer to that. I would assume it should work.

On Fri, May 15, 2015 at 2:22 PM, Evo Eftimov wrote:
> Ok, thanks a lot for clarifying that – btw, was your application a Spark
> Streaming app? I am also looking for confirmation that FAIR scheduling is
> supported for Spark Streaming apps.
RE: Spark Fair Scheduler for Spark Streaming - 1.2 and beyond
Ok, thanks a lot for clarifying that – btw, was your application a Spark Streaming app? I am also looking for confirmation that FAIR scheduling is supported for Spark Streaming apps.

*From:* Richard Marscher [mailto:rmarsc...@localytics.com]
*Sent:* Friday, May 15, 2015 7:20 PM
*To:* Evo Eftimov
*Cc:* Tathagata Das; user
*Subject:* Re: Spark Fair Scheduler for Spark Streaming - 1.2 and beyond

The doc is a bit confusing IMO, but at least for my application I had to use a fair pool configuration to get my stages to be scheduled with FAIR.
Re: Spark Fair Scheduler for Spark Streaming - 1.2 and beyond
The doc is a bit confusing IMO, but at least for my application I had to use a fair pool configuration to get my stages to be scheduled with FAIR.

On Fri, May 15, 2015 at 2:13 PM, Evo Eftimov wrote:
> No pools for the moment – for each of the apps using the straightforward
> way with the spark conf param for scheduling = FAIR
>
> Spark is running in Standalone Mode
>
> Are you saying that configuring pools is mandatory to get FAIR
> scheduling working – from the docs it seemed optional to me
RE: Spark Fair Scheduler for Spark Streaming - 1.2 and beyond
No pools for the moment – for each of the apps, using the straightforward way with the spark conf param for scheduling = FAIR.

Spark is running in Standalone Mode.

Are you saying that configuring pools is mandatory to get FAIR scheduling working? From the docs it seemed optional to me.

*From:* Tathagata Das [mailto:t...@databricks.com]
*Sent:* Friday, May 15, 2015 6:45 PM
*To:* Evo Eftimov
*Cc:* user
*Subject:* Re: Spark Fair Scheduler for Spark Streaming - 1.2 and beyond

How are you configuring the fair scheduler pools?
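(For reference, the "spark conf param" route usually looks something like the following at submit time — the allocation-file path, class name, and jar name here are placeholders, not from this thread:

    spark-submit \
      --conf spark.scheduler.mode=FAIR \
      --conf spark.scheduler.allocation.file=/path/to/fairscheduler.xml \
      --class com.example.StreamingApp \
      streaming-app.jar

spark.scheduler.mode=FAIR switches the scheduler out of FIFO; the allocation file is only needed if you want custom pools, which is what the replies below are getting at.)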
Re: Spark Fair Scheduler for Spark Streaming - 1.2 and beyond
How are you configuring the fair scheduler pools?

On Fri, May 15, 2015 at 8:33 AM, Evo Eftimov wrote:
> I have run / submitted a few Spark Streaming apps configured with FAIR
> scheduling on Spark Streaming 1.2.0, however they still run in FIFO mode.
> Is FAIR scheduling supported at all for Spark Streaming apps, and from
> what release / version - e.g. 1.3.1?
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Fair-Scheduler-for-Spark-Streaming-1-2-and-beyond-tp22902.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org