I am using cutting-edge code from git, but doing my own sbt assembly.
On Mon, Jun 16, 2014 at 10:28 PM, Andre Schumacher <schum...@icsi.berkeley.edu> wrote:

> Hi,
>
> are you using the amplab/spark-1.0.0 images from the global registry?
>
> Andre
>
> On 06/17/2014 01:36 AM, Mohit Jaggi wrote:
> > Hi Folks,
> >
> > I am having trouble getting the Spark driver running in Docker. If I run a
> > pyspark example on my Mac it works, but the same example on a Docker image
> > (via boot2docker) fails with the following logs. I am pointing the Spark
> > driver (which is running the example) to a Spark cluster (the driver is not
> > part of the cluster). I guess this has something to do with Docker's
> > networking stack (it may be getting NAT'd), but I am not sure why (if at
> > all) the spark-worker or spark-master is trying to create a new TCP
> > connection to the driver, instead of responding on the connection
> > initiated by the driver.
> >
> > I would appreciate any help in figuring this out.
> >
> > Thanks,
> >
> > Mohit.
> >
> > --------logs--------
> >
> > Spark Executor Command: "java" "-cp"
> > "::/home/ayasdi/spark/conf:/home/xxxx/spark/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.0.4.jar"
> > "-Xms2g" "-Xmx2g" "-Xms512M" "-Xmx512M"
> > "org.apache.spark.executor.CoarseGrainedExecutorBackend"
> > "akka.tcp://spark@fc31887475e3:43921/user/CoarseGrainedScheduler" "1"
> > "cobalt" "24" "akka.tcp://sparkWorker@aaaa:33952/user/Worker"
> > "app-20140616152201-0021"
> >
> > ========================================
> >
> > log4j:WARN No appenders could be found for logger
> > (org.apache.hadoop.conf.Configuration).
> > log4j:WARN Please initialize the log4j system properly.
> > log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
> > more info.
> > 14/06/16 15:22:05 INFO SparkHadoopUtil: Using Spark's default log4j
> > profile: org/apache/spark/log4j-defaults.properties
> > 14/06/16 15:22:05 INFO SecurityManager: Changing view acls to: ayasdi,root
> > 14/06/16 15:22:05 INFO SecurityManager: SecurityManager: authentication
> > disabled; ui acls disabled; users with view permissions: Set(xxx, xxx)
> > 14/06/16 15:22:05 INFO Slf4jLogger: Slf4jLogger started
> > 14/06/16 15:22:05 INFO Remoting: Starting remoting
> > 14/06/16 15:22:06 INFO Remoting: Remoting started; listening on addresses
> > :[akka.tcp://sparkExecutor@aaaa:33536]
> > 14/06/16 15:22:06 INFO Remoting: Remoting now listens on addresses:
> > [akka.tcp://sparkExecutor@aaaa:33536]
> > 14/06/16 15:22:06 INFO CoarseGrainedExecutorBackend: Connecting to driver:
> > akka.tcp://spark@fc31887475e3:43921/user/CoarseGrainedScheduler
> > 14/06/16 15:22:06 INFO WorkerWatcher: Connecting to worker
> > akka.tcp://sparkWorker@aaaa:33952/user/Worker
> > 14/06/16 15:22:06 WARN Remoting: Tried to associate with unreachable
> > remote address [akka.tcp://spark@fc31887475e3:43921]. Address is now
> > gated for 60000 ms, all messages to this address will be delivered to
> > dead letters.
> > 14/06/16 15:22:06 ERROR CoarseGrainedExecutorBackend: Driver Disassociated
> > [akka.tcp://sparkExecutor@aaaa:33536] -> [akka.tcp://spark@fc31887475e3:43921]
> > disassociated! Shutting down.
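For what it's worth, the executor log above shows the executor dialing back to akka.tcp://spark@fc31887475e3:43921 — and fc31887475e3 is the driver container's auto-generated hostname, which nothing outside the container can resolve. Executors do open their own connection back to the driver rather than reusing the driver's, so the driver has to advertise an address the workers can route to. A sketch of one possible workaround — spark.driver.host, spark.driver.port, and SPARK_LOCAL_IP are real Spark settings, but the IP, image name, and script below are placeholders, and with boot2docker the routable address is typically the VM's, not the Mac's:

```shell
# Config fragment (untested sketch). 192.0.2.10 stands in for an address
# that the cluster's workers can actually reach.

# Inside the driver container, before launching the example:
export SPARK_LOCAL_IP=192.0.2.10     # address the driver binds/advertises
export SPARK_PUBLIC_DNS=192.0.2.10

# Or equivalently in conf/spark-defaults.conf:
#   spark.driver.host   192.0.2.10
#   spark.driver.port   43921       # pinned so the port can be exposed

# Run the container with host networking so the advertised address and
# port actually terminate at the driver process (no NAT in between):
docker run --net=host my-spark-image ./bin/spark-submit my_example.py
```

Without host networking you would instead have to publish the pinned driver port (and the block-manager port) with -p and make sure the advertised IP maps to the host.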