[jira] [Updated] (SPARK-11143) SparkMesosDispatcher can not launch driver in docker

2019-05-20 Thread Hyukjin Kwon (JIRA)


[ https://issues.apache.org/jira/browse/SPARK-11143?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-11143:
-
Labels: bulk-closed  (was: )

> SparkMesosDispatcher can not launch driver in docker
> 
>
> Key: SPARK-11143
> URL: https://issues.apache.org/jira/browse/SPARK-11143
> Project: Spark
>  Issue Type: Bug
>  Components: Mesos
>Affects Versions: 1.5.1
> Environment: Ubuntu 14.04
>Reporter: Klaus Ma
>Priority: Major
>  Labels: bulk-closed
>
> I'm working on integration between Mesos and Spark. For now, I can start the 
> SlaveMesosDispatcher in a Docker container, and I would also like to run the Spark 
> executor in a Mesos Docker container. I used the following configuration, but I got 
> an error; any suggestions?
> Configuration:
> Spark: conf/spark-defaults.conf
> {code}
> spark.mesos.executor.docker.image       ubuntu
> spark.mesos.executor.docker.volumes     /usr/bin:/usr/bin,/usr/local/lib:/usr/local/lib,/usr/lib:/usr/lib,/lib:/lib,/home/test/workshop/spark:/root/spark
> spark.mesos.executor.home               /root/spark
> #spark.executorEnv.SPARK_HOME /root/spark
> spark.executorEnv.MESOS_NATIVE_LIBRARY   /usr/local/lib
> {code}
> NOTE: Spark is installed in /home/test/workshop/spark, and all dependencies are installed.
> After submitting SparkPi to the dispatcher, the driver job starts but fails. The error message is:
> {code}
> I1015 11:10:29.488456 18697 exec.cpp:134] Version: 0.26.0
> I1015 11:10:29.506619 18699 exec.cpp:208] Executor registered on slave 
> b7e24114-7585-40bc-879b-6a1188cb65b6-S1
> WARNING: Your kernel does not support swap limit capabilities, memory limited 
> without swap.
> /bin/sh: 1: ./bin/spark-submit: not found
> {code}
> Does anyone know how to map/set the Spark home in Docker for this case?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-11143) SparkMesosDispatcher can not launch driver in docker

2015-10-15 Thread Klaus Ma (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-11143?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Klaus Ma updated SPARK-11143:
-
Description: 
I'm working on integration between Mesos and Spark. For now, I can start the 
SlaveMesosDispatcher in a Docker container, and I would also like to run the Spark 
executor in a Mesos Docker container. I used the following configuration, but I got 
an error; any suggestions?

Configuration:

Spark: conf/spark-defaults.conf

{code}
spark.mesos.executor.docker.image       ubuntu
spark.mesos.executor.docker.volumes     /usr/bin:/usr/bin,/usr/local/lib:/usr/local/lib,/usr/lib:/usr/lib,/lib:/lib,/home/test/workshop/spark:/root/spark
spark.mesos.executor.home               /root/spark
#spark.executorEnv.SPARK_HOME /root/spark
spark.executorEnv.MESOS_NATIVE_LIBRARY   /usr/local/lib
{code}

NOTE: Spark is installed in /home/test/workshop/spark, and all dependencies are installed.
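
A cluster-mode submission through the dispatcher typically looks like the sketch below; the dispatcher host/port and the examples jar path are placeholders, not values taken from this report:

{code}
# Submit the SparkPi example through the MesosClusterDispatcher (cluster deploy mode).
# "dispatcher-host:7077" and the jar path are placeholders.
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master mesos://dispatcher-host:7077 \
  --deploy-mode cluster \
  /root/spark/lib/spark-examples-1.5.1-hadoop2.6.0.jar 100
{code}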

After submitting SparkPi to the dispatcher, the driver job starts but fails. The error message is:
{code}
I1015 11:10:29.488456 18697 exec.cpp:134] Version: 0.26.0
I1015 11:10:29.506619 18699 exec.cpp:208] Executor registered on slave 
b7e24114-7585-40bc-879b-6a1188cb65b6-S1
WARNING: Your kernel does not support swap limit capabilities, memory limited 
without swap.
/bin/sh: 1: ./bin/spark-submit: not found
{code}
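The last line suggests that ./bin/spark-submit is resolved relative to spark.mesos.executor.home inside the container and that nothing is found there. One way to sanity-check the volume mapping outside of Mesos is sketched below; it assumes Docker is available on the slave host and uses the same ubuntu image and mount as the configuration above:

{code}
# Run the bare ubuntu image with the same volume mapping and look for spark-submit.
# If this fails, the mount and spark.mesos.executor.home do not line up.
docker run --rm \
  -v /home/test/workshop/spark:/root/spark \
  ubuntu ls -l /root/spark/bin/spark-submit
{code}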
Does anyone know how to map/set the Spark home in Docker for this case?

  was:
I'm working on integration between Mesos and Spark. For now, I can start the 
SlaveMesosDispatcher in a Docker container, and I would also like to run the Spark 
executor in a Mesos Docker container. I used the following configuration, but I got 
an error; any suggestions?

Configuration:

Spark: conf/spark-defaults.conf

spark.mesos.executor.docker.image       ubuntu
spark.mesos.executor.docker.volumes     /usr/bin:/usr/bin,/usr/local/lib:/usr/local/lib,/usr/lib:/usr/lib,/lib:/lib,/home/test/workshop/spark:/root/spark
spark.mesos.executor.home               /root/spark
#spark.executorEnv.SPARK_HOME /root/spark
spark.executorEnv.MESOS_NATIVE_LIBRARY   /usr/local/lib

NOTE: Spark is installed in /home/test/workshop/spark, and all dependencies are installed.

After submitting SparkPi to the dispatcher, the driver job starts but fails. The error message is:

I1015 11:10:29.488456 18697 exec.cpp:134] Version: 0.26.0
I1015 11:10:29.506619 18699 exec.cpp:208] Executor registered on slave 
b7e24114-7585-40bc-879b-6a1188cb65b6-S1
WARNING: Your kernel does not support swap limit capabilities, memory limited 
without swap.
/bin/sh: 1: ./bin/spark-submit: not found

Does anyone know how to map/set the Spark home in Docker for this case?


> SparkMesosDispatcher can not launch driver in docker
> 
>
> Key: SPARK-11143
> URL: https://issues.apache.org/jira/browse/SPARK-11143
> Project: Spark
>  Issue Type: Bug
>  Components: Mesos
>Affects Versions: 1.5.1
> Environment: Ubuntu 14.04
>Reporter: Klaus Ma
>
> I'm working on integration between Mesos and Spark. For now, I can start the 
> SlaveMesosDispatcher in a Docker container, and I would also like to run the Spark 
> executor in a Mesos Docker container. I used the following configuration, but I got 
> an error; any suggestions?
> Configuration:
> Spark: conf/spark-defaults.conf
> {code}
> spark.mesos.executor.docker.image       ubuntu
> spark.mesos.executor.docker.volumes     /usr/bin:/usr/bin,/usr/local/lib:/usr/local/lib,/usr/lib:/usr/lib,/lib:/lib,/home/test/workshop/spark:/root/spark
> spark.mesos.executor.home               /root/spark
> #spark.executorEnv.SPARK_HOME /root/spark
> spark.executorEnv.MESOS_NATIVE_LIBRARY   /usr/local/lib
> {code}
> NOTE: Spark is installed in /home/test/workshop/spark, and all dependencies are installed.
> After submitting SparkPi to the dispatcher, the driver job starts but fails. The error message is:
> {code}
> I1015 11:10:29.488456 18697 exec.cpp:134] Version: 0.26.0
> I1015 11:10:29.506619 18699 exec.cpp:208] Executor registered on slave 
> b7e24114-7585-40bc-879b-6a1188cb65b6-S1
> WARNING: Your kernel does not support swap limit capabilities, memory limited 
> without swap.
> /bin/sh: 1: ./bin/spark-submit: not found
> {code}
> Does anyone know how to map/set the Spark home in Docker for this case?



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org