Hi Klaus,

Sorry, I'm not next to a computer, but it could possibly be a bug that it 
doesn't take SPARK_HOME as the base path. Currently the Spark image seems to 
set the working directory so that the relative path works. 
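
For reference, an image would only work with the current code if its 
Dockerfile pins the working directory to the Spark install. A minimal sketch 
(the path here is just an example, chosen to match your volumes mapping, not 
the actual Spark image):

    FROM ubuntu
    # Spark is expected to be bind-mounted here at run time via
    # spark.mesos.executor.docker.volumes, not baked into the image
    WORKDIR /root/spark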

I'll look at the code to verify, but it seems like that could be the case. If 
it's true, feel free to create a JIRA and/or provide a fix.
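
In the meantime you can sanity-check the container side by hand. A rough 
sketch against your current setup (image name and paths copied from your 
spark-defaults.conf):

    # reproduce the mount your volumes setting describes, then check the
    # relative path the executor tries to run
    docker run --rm \
      -v /home/test/workshop/spark:/root/spark \
      ubuntu \
      /bin/sh -c 'cd /root/spark && ls -l ./bin/spark-submit'

If that lists the file, the mount is fine and the failure is just the relative 
./bin/spark-submit being resolved from the wrong working directory.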

Tim

> On Oct 16, 2015, at 9:28 AM, Klaus Ma <kl...@cguru.net> wrote:
> 
> Hi team,
> 
> I'm working on the integration between Mesos & Spark. For now, I can start 
> SlaveMesosDispatcher in a Docker container, and I'd also like to run the 
> Spark executor in a Mesos Docker container. I set up the following 
> configuration for it, but I got an error; any suggestions?
> 
> Configuration:
> 
> Spark: conf/spark-defaults.conf
> 
> spark.mesos.executor.docker.image        ubuntu
> spark.mesos.executor.docker.volumes      /usr/bin:/usr/bin,/usr/local/lib:/usr/local/lib,/usr/lib:/usr/lib,/lib:/lib,/home/test/workshop/spark:/root/spark
> spark.mesos.executor.home                /root/spark
> #spark.executorEnv.SPARK_HOME             /root/spark
> spark.executorEnv.MESOS_NATIVE_LIBRARY   /usr/local/lib
> NOTE: Spark is installed in /home/test/workshop/spark, and all 
> dependencies are installed.
> 
> After submitting SparkPi to the dispatcher, the driver job is started but 
> fails. The error message is:
> 
> I1015 11:10:29.488456 18697 exec.cpp:134] Version: 0.26.0
> I1015 11:10:29.506619 18699 exec.cpp:208] Executor registered on slave 
> b7e24114-7585-40bc-879b-6a1188cb65b6-S1
> WARNING: Your kernel does not support swap limit capabilities, memory limited 
> without swap.
> /bin/sh: 1: ./bin/spark-submit: not found
> 
> Does anyone know how to map/set the Spark home in the Docker container for 
> this case?
> 
> 
> ---- 
> Da (Klaus), Ma (马达) | PMP® | Advisory Software Engineer 
> Platform Symphony/DCOS Development & Support, STG, IBM GCG 
> +86-10-8245 4084 | mad...@cn.ibm.com | http://www.cguru.net
