This is because setLocalProperty makes all Spark jobs submitted from the
current thread belong to the given pool. In Spark Streaming, however, the
jobs are actually launched in the background from a different thread, so
this setting does not take effect. There is a workaround: if you are doing
any kind of output operation on DStreams, like DStream.foreachRDD(), you
can set the property inside that operation:

dstream.foreachRDD(rdd =>
   rdd.sparkContext.setLocalProperty(...)
)
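
For example, filling in the elided arguments with the property and pool
name from the code quoted below (a minimal sketch; the rdd.count() action
is only there to trigger a job):

dstream.foreachRDD { rdd =>
  // setLocalProperty affects jobs submitted from this thread, which is
  // the thread that runs this batch's output jobs
  rdd.sparkContext.setLocalProperty("spark.scheduler.pool", myStreamingPool)
  rdd.count()  // this job is scheduled in pool myStreamingPool
}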

On Wed, Jul 30, 2014 at 1:43 AM, liuwei <stupi...@126.com> wrote:
> In my spark streaming program, I set scheduler pool, just as follows:
>
> val myFairSchedulerFile = "xxx.xml"
> val myStreamingPool = "xxx"
>
> System.setProperty("spark.scheduler.allocation.file", myFairSchedulerFile)
> val conf = new SparkConf()
> val ssc = new StreamingContext(conf, batchInterval)
> ssc.sparkContext.setLocalProperty("spark.scheduler.pool", myStreamingPool)
> ...
> ssc.start()
> ssc.awaitTermination()
>
> I submitted my Spark Streaming job to my Spark cluster and found that the
> stages' pool name is "default"; it seems that
> ssc.sparkContext.setLocalProperty("spark.scheduler.pool", myStreamingPool)
> does not work.
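
For completeness: pool properties (scheduling mode, weight, minShare) for
the pool named in spark.scheduler.pool are configured in the allocation
file (the xxx.xml above). A minimal sketch in Spark's fair scheduler XML
format, with the pool name assumed to match the value of myStreamingPool:

<?xml version="1.0"?>
<allocations>
  <!-- name must match the value passed to spark.scheduler.pool -->
  <pool name="xxx">
    <schedulingMode>FAIR</schedulingMode>
    <weight>1</weight>
    <minShare>0</minShare>
  </pool>
</allocations>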
