When I switch to the default coarse-grained mode, it works fine.
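For anyone hitting the same thing: the Mesos scheduler mode is controlled by the `spark.mesos.coarse` property, which can be passed on the command line. Below is a sketch of the same submission with coarse-grained mode requested explicitly — the master URL, class, jar, and memory settings are copied from the original command; only the extra `--conf` flag (and the standard `bin/spark-submit` path) are my additions.

```shell
# Same job submission as in the quoted mail, but explicitly asking for
# coarse-grained Mesos mode (spark.mesos.coarse=true) instead of relying
# on the cluster default.
$SPARK_HOME/bin/spark-submit \
  --class com.xxx.ETL \
  --master mesos://192.168.191.116:7077 \
  --deploy-mode cluster --supervise \
  --conf spark.mesos.coarse=true \
  --driver-memory 2G --executor-memory 10G \
  --total-executor-cores 4 \
  http://jar.xxx.info/streaming-etl-assembly-1.0.jar
```

The same property can also be set in `spark-defaults.conf` or via `SparkConf` in the application itself.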

> On Mar 14, 2016, at 21:55, sjk <shijinkui...@163.com> wrote:
> 
> Hi all, when I run a task on Mesos, the task fails with the error below. Thanks a lot for any help.
> 
> 
> cluster mode, command:
> 
> $SPARK_HOME/spark-submit --class com.xxx.ETL --master 
> mesos://192.168.191.116:7077 --deploy-mode cluster --supervise 
> --driver-memory 2G --executor-memory 10G 
> --total-executor-cores 4 http://jar.xxx.info/streaming-etl-assembly-1.0.jar 
> 
> 
> task stderr:
> 
> 
> I0314 21:13:17.520845 29008 fetcher.cpp:424] Fetcher Info: 
> {"cache_directory":"\/tmp\/mesos\/fetch\/slaves\/c2f100e1-13a8-40d9-a00f-68389300dfc1-S7\/appweb","items":[{"action":"BYPASS_CACHE","uri":{"extract":true,"value":"\/data\/program\/spark-1.6.0-bin-hadoop2.6.tgz"}}],"sandbox_directory":"\/data\/mesos\/slaves\/c2f100e1-13a8-40d9-a00f-68389300dfc1-S7\/frameworks\/dd8e95f7-3626-4e46-b48c-b3b58b573c4d-0044\/executors\/c2f100e1-13a8-40d9-a00f-68389300dfc1-S7\/runs\/92509aa4-7804-459b-857d-cfc08c31a993","user":"appweb"}
> I0314 21:13:17.522541 29008 fetcher.cpp:379] Fetching URI 
> '/data/program/spark-1.6.0-bin-hadoop2.6.tgz'
> I0314 21:13:17.522562 29008 fetcher.cpp:250] Fetching directly into the 
> sandbox directory
> I0314 21:13:17.522586 29008 fetcher.cpp:187] Fetching URI 
> '/data/program/spark-1.6.0-bin-hadoop2.6.tgz'
> I0314 21:13:17.522603 29008 fetcher.cpp:167] Copying resource with command:cp 
> '/data/program/spark-1.6.0-bin-hadoop2.6.tgz' 
> '/data/mesos/slaves/c2f100e1-13a8-40d9-a00f-68389300dfc1-S7/frameworks/dd8e95f7-3626-4e46-b48c-b3b58b573c4d-0044/executors/c2f100e1-13a8-40d9-a00f-68389300dfc1-S7/runs/92509aa4-7804-459b-857d-cfc08c31a993/spark-1.6.0-bin-hadoop2.6.tgz'
> I0314 21:13:17.880008 29008 fetcher.cpp:84] Extracting with command: tar -C 
> '/data/mesos/slaves/c2f100e1-13a8-40d9-a00f-68389300dfc1-S7/frameworks/dd8e95f7-3626-4e46-b48c-b3b58b573c4d-0044/executors/c2f100e1-13a8-40d9-a00f-68389300dfc1-S7/runs/92509aa4-7804-459b-857d-cfc08c31a993'
>  -xf 
> '/data/mesos/slaves/c2f100e1-13a8-40d9-a00f-68389300dfc1-S7/frameworks/dd8e95f7-3626-4e46-b48c-b3b58b573c4d-0044/executors/c2f100e1-13a8-40d9-a00f-68389300dfc1-S7/runs/92509aa4-7804-459b-857d-cfc08c31a993/spark-1.6.0-bin-hadoop2.6.tgz'
> I0314 21:13:20.911213 29008 fetcher.cpp:92] Extracted 
> '/data/mesos/slaves/c2f100e1-13a8-40d9-a00f-68389300dfc1-S7/frameworks/dd8e95f7-3626-4e46-b48c-b3b58b573c4d-0044/executors/c2f100e1-13a8-40d9-a00f-68389300dfc1-S7/runs/92509aa4-7804-459b-857d-cfc08c31a993/spark-1.6.0-bin-hadoop2.6.tgz'
>  into 
> '/data/mesos/slaves/c2f100e1-13a8-40d9-a00f-68389300dfc1-S7/frameworks/dd8e95f7-3626-4e46-b48c-b3b58b573c4d-0044/executors/c2f100e1-13a8-40d9-a00f-68389300dfc1-S7/runs/92509aa4-7804-459b-857d-cfc08c31a993'
> I0314 21:13:20.911278 29008 fetcher.cpp:456] Fetched 
> '/data/program/spark-1.6.0-bin-hadoop2.6.tgz' to 
> '/data/mesos/slaves/c2f100e1-13a8-40d9-a00f-68389300dfc1-S7/frameworks/dd8e95f7-3626-4e46-b48c-b3b58b573c4d-0044/executors/c2f100e1-13a8-40d9-a00f-68389300dfc1-S7/runs/92509aa4-7804-459b-857d-cfc08c31a993/spark-1.6.0-bin-hadoop2.6.tgz'
> Exception in thread "main" java.lang.NoClassDefFoundError: 
> org/apache/spark/launcher/Main
> Caused by: java.lang.ClassNotFoundException: org.apache.spark.launcher.Main
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
> Could not find the main class: org.apache.spark.launcher.Main. Program will 
> exit.
> 
> 
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 


