Hi,

I'm using Spark 1.6.0, where, according to the documentation, using dynamic
allocation requires the external shuffle service to be enabled as well.

When I submit a Spark job with the following command:

spark-submit \
--master **** \
--deploy-mode cluster \
--executor-cores 3 \
--conf "spark.streaming.backpressure.enabled=true" \
--conf "spark.dynamicAllocation.enabled=true" \
--conf "spark.dynamicAllocation.minExecutors=2" \
--conf "spark.dynamicAllocation.maxExecutors=24" \
--conf "spark.shuffle.service.enabled=true" \
--conf "spark.executor.memory=8g" \
--conf "spark.driver.memory=10g" \
--class SparkJobRunner \
/opt/clicktale/entityCreator/com.clicktale.ai.entity-creator-assembly-0.0.2.jar
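For completeness, my understanding from the standalone-mode docs is that setting spark.shuffle.service.enabled on spark-submit alone is not enough, and the service also has to be enabled on each worker, along these lines in spark-env.sh (a sketch; the port shown is just the 7337 default):

```shell
# Sketch of the worker-side setting (spark-env.sh on each worker daemon).
# spark.shuffle.service.port is shown with its default value, 7337.
export SPARK_WORKER_OPTS="-Dspark.shuffle.service.enabled=true -Dspark.shuffle.service.port=7337"
```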

I'm seeing errors in the worker logs: the executors fail to connect to the
external shuffle service:

16/03/08 17:33:15 ERROR storage.BlockManager: Failed to connect to external shuffle server, will retry 2 more times after waiting 5 seconds...
java.io.IOException: Failed to connect to ****
        at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:216)
        at org.apache.spark.network.client.TransportClientFactory.createUnmanagedClient(TransportClientFactory.java:181)
        at org.apache.spark.network.shuffle.ExternalShuffleClient.registerWithShuffleServer(ExternalShuffleClient.java:141)
        at org.apache.spark.storage.BlockManager$$anonfun$registerWithExternalShuffleServer$1.apply$mcVI$sp(BlockManager.scala:211)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
        at org.apache.spark.storage.BlockManager.registerWithExternalShuffleServer(BlockManager.scala:208)
        at org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:194)
        at org.apache.spark.executor.Executor.<init>(Executor.scala:85)
        at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$receive$1.applyOrElse(CoarseGrainedExecutorBackend.scala:83)
        at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:116)
        at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:204)
        at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
        at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:215)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

I verified that all relevant ports are open. Has anyone else run into this
failure?
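By "relevant ports" I mainly mean the shuffle service port (7337 by default); the kind of connectivity check I ran from one host to another looks roughly like this ("worker-host" is a placeholder for one of the standalone workers):

```shell
# Hypothetical connectivity check; "worker-host" is a placeholder.
# Exit status 0 means something is listening on the shuffle service port.
nc -zv worker-host 7337
```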

Yuval.



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Using-dynamic-allocation-and-shuffle-service-in-Standalone-Mode-tp26430.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org