Hi All,
         I am running 3 executors in my Spark Streaming application, with
3 cores per executor. I have written a custom receiver for ingesting
network data.

In my current configuration I launch 3 receivers, one receiver per
executor.
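
For reference, the streams are set up roughly like this. MyNetworkReceiver
is a placeholder name for my custom receiver (sketched further below), and
the hostnames, window sizes, and the count() action are made up for
illustration:

import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setAppName("CustomReceiverApp")
val ssc = new StreamingContext(conf, Seconds(10))

// One receiver per executor is the intent: 3 receiver streams, unioned
// into a single DStream that the windowed action runs over.
val streams = (1 to 3).map(i =>
  ssc.receiverStream(new MyNetworkReceiver(s"worker-$i.example.com")))
val windowed = ssc.union(streams).window(Seconds(60), Seconds(10))

windowed.foreachRDD { rdd =>
  // This is the action on the accumulated window data that stops
  // being scheduled when all cores are taken by receivers.
  println(s"window count: ${rdd.count()}")
}

ssc.start()
ssc.awaitTermination()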

At runtime, if 2 of my executors die, I am left with only one executor
and all 3 receivers get scheduled onto it. Since that executor has only
3 cores and all of them are occupied by the 3 receivers, no cores remain
for batch tasks, so the action on the accumulated window data (DStream)
is never scheduled and my application hangs.

Is there a way to restrict the number of receivers per executor, so that
at least one core is always left free to run the action on the DStream?
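
The closest thing I have found is the preferredLocation hint on the
Receiver API; a rough sketch of how I am using it is below (the hostname
and receive logic are placeholders). But as I understand it this is only
a scheduling hint, not a hard limit, so once the preferred hosts are gone
the scheduler is still free to pack all 3 receivers onto the surviving
executor:

import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

class MyNetworkReceiver(host: String)
    extends Receiver[String](StorageLevel.MEMORY_AND_DISK_SER) {

  // Hint to the receiver scheduler about where this receiver should run.
  override def preferredLocation: Option[String] = Some(host)

  override def onStart(): Unit = {
    // Start a background thread that reads from the network
    // and hands records to Spark via store(...).
  }

  override def onStop(): Unit = {
    // Tear down the network connection and the reader thread.
  }
}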

Thanks


