[ https://issues.apache.org/jira/browse/SPARK-11759?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15010992#comment-15010992 ]

Balaji Ramamoorthy commented on SPARK-11759:
--------------------------------------------

But I do see the container created:

{code}
CONTAINER ID   IMAGE             COMMAND                  CREATED          STATUS                     NAMES
828104a15f91   xyz/mesos-spark   "/bin/sh -c './bin/s     10 seconds ago   Exited (1) 3 seconds ago   mesos-62488d9a-0db8-43c4-9b1a-79c6d74d50d9-S0.23e41b7b-1826-463e-aa66-76d3eac9fa5f
{code}
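Since the container exits with status 1 almost immediately, its stdout/stderr should say why. A minimal sketch for pulling that out on the slave host (assuming Docker is installed there; the container ID is the one from the listing above):

{code}
# Fetch the dead container's output; a container that shows
# "Exited (1)" usually logged the failing command or a shell
# error before dying.
docker logs 828104a15f91

# The recorded exit code can also be read back via inspect.
docker inspect --format '{{.State.ExitCode}}' 828104a15f91
{code}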

This is how I submit the job:

{code}
./bin/spark-submit --deploy-mode cluster \
  --master mesos://192.168.19.46:7077 \
  --driver-memory 1G \
  --executor-memory 512m \
  --executor-cores 1 \
  --conf spark.mesos.coarse=true \
  --conf spark.mesos.executor.docker.image=xyz/mesos-spark \
  --conf spark.mesos.executor.home=/opt/spark \
  --conf spark.mesos.extra.cores=1 \
  --conf spark.driver.host=192.168.19.46 \
  --class org.apache.spark.examples.SparkPi \
  http://192.168.19.46/lib/spark-examples-1.5.1-hadoop2.6.0.jar 100
{code}

(Note: my original invocation had a stray {{--conf}} with no key=value before {{--executor-cores 1}}; it is removed above.)

[~alvis lui] If your container is not created, where are you seeing the 
spark-class not found error? On the Mesos slave? Are you running mesos-slave as 
a Docker container too?

Btw, I also see the following message, which tells me that the dispatcher is in 
fine-grained mode even though I specified "--conf spark.mesos.coarse=true" 
(coarse mode would register CoarseMesosSchedulerBackend rather than 
MesosSchedulerBackend):

{code}
15/11/18 13:33:57 INFO MesosSchedulerBackend: Registered as framework ID e5719054-f58a-4952-893a-61530ce40144-0006
15/11/18 13:33:57 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 43173.
{code}

Am I missing something in my configuration?
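One thing worth ruling out (a sketch, not a confirmed fix): make sure nothing overrides spark.mesos.coarse after the command line is parsed. Pinning the Mesos settings in conf/spark-defaults.conf on the machine running the dispatcher makes them unambiguous:

{code}
# conf/spark-defaults.conf (sketch; values taken from the
# spark-submit invocation above)
spark.mesos.coarse                   true
spark.mesos.executor.docker.image    xyz/mesos-spark
spark.mesos.executor.home            /opt/spark
spark.mesos.extra.cores              1
{code}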

> Spark task on mesos with docker fails with sh: 1: /opt/spark/bin/spark-class: 
> not found
> ---------------------------------------------------------------------------------------
>
>                 Key: SPARK-11759
>                 URL: https://issues.apache.org/jira/browse/SPARK-11759
>             Project: Spark
>          Issue Type: Question
>          Components: Deploy, Mesos
>            Reporter: Luis Alves
>
> I'm using Spark 1.5.1 and Mesos 0.25 in cluster mode. I have the 
> spark-dispatcher running and use spark-submit to submit jobs. The driver is 
> launched, but it fails because the task it launches fails.
> In the logs of the launched task I can see the following error: 
> sh: 1: /opt/spark/bin/spark-class: not found
> I checked my Docker image and /opt/spark/bin/spark-class exists. I then 
> noticed that the script is invoked with sh, so I tried to run the following 
> inside the Docker image:
> sh /opt/spark/bin/spark-class org.apache.spark.deploy.master.Master
> It fails with the following error:
> spark-class: 73: spark-class: Syntax error: "(" unexpected
> Is this an error in Spark?
> Thanks
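For context on the quoted {{"(" unexpected}} error: bin/spark-class is a bash script that uses bash-specific syntax (such as arrays), which a POSIX sh like dash (the /bin/sh on Debian-based images) rejects. A minimal sketch reproducing the same symptom; the script name is made up for illustration:

{code}
# bash-vs-sh sketch: a tiny script using a bash array, the same
# construct family that trips up spark-class under plain sh.
cat > /tmp/uses_array.sh <<'EOF'
CMD=("java" "-cp" "conf")
echo "${CMD[@]}"
EOF

bash /tmp/uses_array.sh           # prints: java -cp conf
sh /tmp/uses_array.sh || true     # under dash: Syntax error: "(" unexpected
{code}

So running the script as "sh /opt/spark/bin/spark-class ..." is expected to fail; it needs to be executed with bash (or made executable so its shebang is honored).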



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
