Re: Is there more information about the Spark shuffle service

2015-07-21 Thread Ted Yu
To my knowledge, there is no HA for External Shuffle Service. 
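
For context, the external shuffle service is enabled per application through Spark configuration; on YARN it also runs as an auxiliary service inside each NodeManager. A minimal sketch (property names as of Spark 1.x; values shown are illustrative defaults, not a definitive setup):

```properties
# spark-defaults.conf -- have executors register their shuffle files
# with the external shuffle service instead of serving them directly
spark.shuffle.service.enabled    true
# typically paired with dynamic allocation, so executors can be
# removed without losing the shuffle output they wrote
spark.dynamicAllocation.enabled  true

# On YARN, the service itself must also be registered in yarn-site.xml:
#   yarn.nodemanager.aux-services = spark_shuffle
#   yarn.nodemanager.aux-services.spark_shuffle.class =
#     org.apache.spark.network.yarn.YarnShuffleService
```

Note that this only relocates where shuffle files are served from; as stated above, if the service on a node goes down, the shuffle blocks on that node become unavailable until it comes back, and there is no built-in failover or replication.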

Cheers



 On Jul 21, 2015, at 2:16 AM, JoneZhang joyoungzh...@gmail.com wrote:
 
 There is a statement, "If the service is enabled, Spark executors will fetch
 shuffle files from the service instead of from each other," in the documentation:
 https://spark.apache.org/docs/1.3.0/job-scheduling.html#graceful-decommission-of-executors
 
 
 Is there more information about the shuffle service?
 For example: how is a shutdown of the service handled, and does any
 redundancy exist?
 
 Thanks!
 
 
 
 
 --
 View this message in context: 
 http://apache-spark-user-list.1001560.n3.nabble.com/Is-there-more-information-about-spark-shuffer-service-tp23925.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.
 
 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org
 

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Is there more information about the Spark shuffle service

2015-07-21 Thread JoneZhang
There is a statement, "If the service is enabled, Spark executors will fetch
shuffle files from the service instead of from each other," in the documentation:
https://spark.apache.org/docs/1.3.0/job-scheduling.html#graceful-decommission-of-executors


Is there more information about the shuffle service?
For example: how is a shutdown of the service handled, and does any
redundancy exist?

Thanks!





-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org