[ https://issues.apache.org/jira/browse/SPARK-11143?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14960420#comment-14960420 ]
Klaus Ma edited comment on SPARK-11143 at 10/16/15 9:24 AM:
------------------------------------------------------------
I addressed the issue with a new docker image that is more about the environment; I still suggest providing parameters to simplify the docker configuration.

was (Author: klaus1982): I addressed the issue by a new docker image which is more environment; but I still suggest to provide parameters to simplify the configuration.

> SparkMesosDispatcher can not launch driver in docker
> ----------------------------------------------------
>
>                 Key: SPARK-11143
>                 URL: https://issues.apache.org/jira/browse/SPARK-11143
>             Project: Spark
>          Issue Type: Bug
>          Components: Mesos
>    Affects Versions: 1.5.1
>         Environment: Ubuntu 14.04
>            Reporter: Klaus Ma
>
> I'm working on the integration between Mesos and Spark. For now, I can start
> SlaveMesosDispatcher in a docker container, and I would also like to run the
> Spark executor in a Mesos docker container. I did the following configuration
> for it, but I got an error; any suggestions?
> Configuration:
> Spark: conf/spark-defaults.conf
> {code}
> spark.mesos.executor.docker.image ubuntu
> spark.mesos.executor.docker.volumes /usr/bin:/usr/bin,/usr/local/lib:/usr/local/lib,/usr/lib:/usr/lib,/lib:/lib,/home/test/workshop/spark:/root/spark
> spark.mesos.executor.home /root/spark
> #spark.executorEnv.SPARK_HOME /root/spark
> spark.executorEnv.MESOS_NATIVE_LIBRARY /usr/local/lib
> {code}
> NOTE: Spark is installed in /home/test/workshop/spark, and all dependencies
> are installed.
> After submitting SparkPi to the dispatcher, the driver job started but failed.
> The error message is:
> {code}
> I1015 11:10:29.488456 18697 exec.cpp:134] Version: 0.26.0
> I1015 11:10:29.506619 18699 exec.cpp:208] Executor registered on slave b7e24114-7585-40bc-879b-6a1188cb65b6-S1
> WARNING: Your kernel does not support swap limit capabilities, memory limited without swap.
> /bin/sh: 1: ./bin/spark-submit: not found
> {code}
> Does anyone know how to map/set the Spark home in docker for this case?

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
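
The "./bin/spark-submit: not found" error indicates that the path named by spark.mesos.executor.home does not contain a Spark installation inside the container. Since the comment resolves this with a custom image, one way to get the same effect with the stock ubuntu image is to make sure the host Spark directory is bind-mounted at exactly the path spark.mesos.executor.home points to. A minimal sketch, assuming Spark lives at /home/test/workshop/spark on the host (all paths here are taken from the report, not verified in any environment):

{code}
# conf/spark-defaults.conf -- sketch, not a tested configuration
spark.mesos.executor.docker.image   ubuntu
# Mount the host Spark install into the container at the executor home,
# plus the Mesos native library so the executor can load it.
spark.mesos.executor.docker.volumes /home/test/workshop/spark:/root/spark:ro,/usr/local/lib:/usr/local/lib:ro
# Must match the container-side mount point above, so ./bin/spark-submit resolves.
spark.mesos.executor.home           /root/spark
spark.executorEnv.MESOS_NATIVE_LIBRARY /usr/local/lib/libmesos.so
{code}

The key constraint is that the container-side half of the volume mapping and spark.mesos.executor.home name the same directory; the alternative, as in the comment above, is to bake Spark into the image at that path so no mount is needed. (The libmesos.so filename is an assumption; MESOS_NATIVE_LIBRARY conventionally points at the shared library itself rather than its directory.)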