Long story [1] short, Akka opens up dynamic, random ports for each job [2],
so simple NAT fails. You might try some trickery with a DNS server and
Docker's --net=host ; a rough sketch follows.
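Something like this, for example (a minimal, untested sketch: the image name
spark-driver, the master URL, and the address/port values are all
placeholders for your setup):

    # Pin the driver's advertised address and port in
    # conf/spark-defaults.conf inside the image, per [2], so the
    # workers can reach back to a known endpoint:
    #   spark.driver.host   192.168.59.3
    #   spark.driver.port   7001

    # Run the driver on the host's network stack so Akka's dynamically
    # chosen ports are reachable without any NAT or -p port mapping:
    docker run --net=host spark-driver \
        bin/spark-submit --master spark://master:7077 \
        examples/src/main/python/pi.py

The DNS trickery matters because the workers resolve the driver by the
hostname it advertises; in your log that is the container ID fc31887475e3,
which means nothing outside the container.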


[1] http://apache-spark-user-list.1001560.n3.nabble.com/Comprehensive-Port-Configuration-reference-tt5384.html#none
[2] http://spark.apache.org/docs/latest/spark-standalone.html#configuring-ports-for-network-security

Jacob D. Eisinger
IBM Emerging Technologies
jeis...@us.ibm.com - (512) 286-6075

From:    Mohit Jaggi <mohitja...@gmail.com>
To:      user@spark.apache.org
Date:    06/16/2014 05:36 PM
Subject: spark with docker: errors with akka, NAT?

Hi Folks,

I am having trouble getting the Spark driver to run in Docker. If I run a
PySpark example on my Mac it works, but the same example in a Docker image
(via boot2docker) fails with the logs below. I am pointing the Spark driver
(which runs the example) at a Spark cluster; the driver is not part of the
cluster. I guess this has something to do with Docker's networking stack
(it may be getting NAT'd), but I am not sure why (if at all) the Spark
worker or master is trying to open a new TCP connection to the driver
instead of responding on the connection the driver initiated.

I would appreciate any help in figuring this out.

Thanks,
Mohit.


--------logs--------

Spark Executor Command: "java" "-cp"
"::/home/ayasdi/spark/conf:/home/xxxx/spark/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.0.4.jar"
"-Xms2g" "-Xmx2g" "-Xms512M" "-Xmx512M"
"org.apache.spark.executor.CoarseGrainedExecutorBackend"
"akka.tcp://spark@fc31887475e3:43921/user/CoarseGrainedScheduler" "1"
"cobalt" "24" "akka.tcp://sparkWorker@aaaa:33952/user/Worker"
"app-20140616152201-0021"
========================================

log4j:WARN No appenders could be found for logger (org.apache.hadoop.conf.Configuration).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
14/06/16 15:22:05 INFO SparkHadoopUtil: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
14/06/16 15:22:05 INFO SecurityManager: Changing view acls to: ayasdi,root
14/06/16 15:22:05 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(xxx, xxx)
14/06/16 15:22:05 INFO Slf4jLogger: Slf4jLogger started
14/06/16 15:22:05 INFO Remoting: Starting remoting
14/06/16 15:22:06 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkExecutor@aaaa:33536]
14/06/16 15:22:06 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkExecutor@aaaa:33536]
14/06/16 15:22:06 INFO CoarseGrainedExecutorBackend: Connecting to driver: akka.tcp://spark@fc31887475e3:43921/user/CoarseGrainedScheduler
14/06/16 15:22:06 INFO WorkerWatcher: Connecting to worker akka.tcp://sparkWorker@aaaa:33952/user/Worker
14/06/16 15:22:06 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://spark@fc31887475e3:43921]. Address is now gated for 60000 ms, all messages to this address will be delivered to dead letters.
14/06/16 15:22:06 ERROR CoarseGrainedExecutorBackend: Driver Disassociated [akka.tcp://sparkExecutor@aaaa:33536] -> [akka.tcp://spark@fc31887475e3:43921] disassociated! Shutting down.
