[jira] [Comment Edited] (SPARK-6461) spark.executorEnv.PATH in spark-defaults.conf is not pass to mesos

2015-03-23 Thread Littlestar (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-6461?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14375468#comment-14375468 ]

Littlestar edited comment on SPARK-6461 at 3/23/15 9:29 AM:


Each Mesos slave node has Java and a Hadoop DataNode installed.

Now I have added the following settings to mesos-master-env.sh and mesos-slave-env.sh:

export MESOS_JAVA_HOME=/home/test/jdk
export MESOS_HADOOP_HOME=/home/test/hadoop-2.4.0
export MESOS_HADOOP_CONF_DIR=/home/test/hadoop-2.4.0/etc/hadoop
export MESOS_PATH=/home/test/jdk/bin:/home/test/hadoop-2.4.0/sbin:/home/test/hadoop-2.4.0/bin:/sbin:/bin:/usr/sbin:/usr/bin

It still fails with:

/usr/bin/env: bash: No such file or directory

Thanks.
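The workaround above pushes the variables through Mesos's own environment files, but the original intent was that spark.executorEnv.* entries in spark-defaults.conf reach the executor environment. The sketch below (file contents illustrative, not Spark's actual implementation) shows how such entries map to environment variables, which is the behavior this issue reports as missing:

```python
# Minimal sketch: collect spark.executorEnv.* entries from
# spark-defaults.conf-style text into an environment dict.
# This mirrors the intended mapping, not Spark's real code.
PREFIX = "spark.executorEnv."

def executor_env(conf_text):
    """Return {VAR: value} for every spark.executorEnv.* line."""
    env = {}
    for line in conf_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        # spark-defaults.conf separates key and value by whitespace
        key, _, value = line.partition(" ")
        key, value = key.strip(), value.strip()
        if key.startswith(PREFIX):
            env[key[len(PREFIX):]] = value
    return env

conf = """
spark.executorEnv.PATH /home/test/jdk/bin:/usr/bin:/bin
spark.executorEnv.JAVA_HOME /home/test/jdk
spark.executorEnv.HADOOP_HOME /home/test/hadoop-2.4.0
"""
print(executor_env(conf))
```

If these three keys produced the corresponding PATH, JAVA_HOME, and HADOOP_HOME variables in the executor's environment, the MESOS_* workaround above would be unnecessary.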



was (Author: cnstar9988):
Each Mesos slave node has Java and a Hadoop DataNode installed.

I also added the following settings to mesos-master-env.sh and mesos-slave-env.sh:

export MESOS_JAVA_HOME=/home/test/jdk
export MESOS_HADOOP_HOME=/home/test/hadoop-2.4.0
export MESOS_HADOOP_CONF_DIR=/home/test/hadoop-2.4.0/etc/hadoop
export MESOS_PATH=/home/test/jdk/bin:/home/test/hadoop-2.4.0/sbin:/home/test/hadoop-2.4.0/bin:/sbin:/bin:/usr/sbin:/usr/bin

It still fails with:

/usr/bin/env: bash: No such file or directory

Thanks.


> spark.executorEnv.PATH in spark-defaults.conf is not pass to mesos
> --
>
> Key: SPARK-6461
> URL: https://issues.apache.org/jira/browse/SPARK-6461
> Project: Spark
>  Issue Type: Bug
>  Components: Scheduler
>Affects Versions: 1.3.0
>Reporter: Littlestar
>
> I used Mesos to run Spark 1.3.0 ./run-example SparkPi, but it failed.
> spark.executorEnv.PATH in spark-defaults.conf is not passed to Mesos; the
> following settings are not propagated:
> spark.executorEnv.PATH
> spark.executorEnv.HADOOP_HOME
> spark.executorEnv.JAVA_HOME
> E0323 14:24:36.400635 11355 fetcher.cpp:109] HDFS copyToLocal failed: hadoop 
> fs -copyToLocal 
> 'hdfs://192.168.1.9:54310/home/test/spark-1.3.0-bin-2.4.0.tar.gz' 
> '/home/mesos/work_dir/slaves/20150323-100710-1214949568-5050-3453-S3/frameworks/20150323-133400-1214949568-5050-15440-0007/executors/20150323-100710-1214949568-5050-3453-S3/runs/915b40d8-f7c4-428a-9df8-ac9804c6cd21/spark-1.3.0-bin-2.4.0.tar.gz'
> sh: hadoop: command not found
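The "sh: hadoop: command not found" line in the fetcher log above comes from the Mesos fetcher shelling out to `hadoop fs -copyToLocal`, so resolution depends entirely on the PATH of the slave process, not on spark-defaults.conf. A small sketch (fake hadoop binary, illustrative paths) of how the lookup succeeds or fails:

```shell
#!/bin/sh
# The fetcher invokes `hadoop` through sh; whether it resolves depends
# only on the PATH the slave process was started with. Fake a hadoop
# binary to demonstrate both outcomes.
tmp=$(mktemp -d)
printf '#!/bin/sh\necho fake-hadoop\n' > "$tmp/hadoop"
chmod +x "$tmp/hadoop"

# Without its directory on PATH: same failure mode as the report.
PATH=/nonexistent command -v hadoop >/dev/null 2>&1 \
    || echo "hadoop: command not found"

# With it on PATH (what the MESOS_PATH workaround tries to achieve):
PATH="$tmp" command -v hadoop

rm -rf "$tmp"
```

This is why exporting the Hadoop bin directories in the slave's own environment (e.g. via mesos-slave-env.sh) changes the fetcher behavior while spark.executorEnv.PATH does not.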



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Comment Edited] (SPARK-6461) spark.executorEnv.PATH in spark-defaults.conf is not pass to mesos

2015-03-23 Thread Littlestar (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-6461?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14375580#comment-14375580 ]

Littlestar edited comment on SPARK-6461 at 3/23/15 9:04 AM:


When I add MESOS_HADOOP_CONF_DIR to every mesos-master-env.sh and mesos-slave-env.sh, it throws the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/executor/MesosExecutorBackend
Caused by: java.lang.ClassNotFoundException: org.apache.spark.executor.MesosExecutorBackend

Similar to https://issues.apache.org/jira/browse/SPARK-1702
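A NoClassDefFoundError for MesosExecutorBackend typically means the Mesos executor started but could not locate the Spark distribution's classes on the slave. On Spark-on-Mesos this is commonly addressed by pointing spark.executor.uri at the Spark tarball; the URI below simply reuses the one from the fetcher log in this report and is illustrative, not a confirmed fix for this issue:

```properties
# spark-defaults.conf (illustrative; URI taken from the fetcher log above)
spark.executor.uri  hdfs://192.168.1.9:54310/home/test/spark-1.3.0-bin-2.4.0.tar.gz
```

The fetcher then downloads and unpacks the tarball into the executor sandbox, so MesosExecutorBackend is on the executor's classpath regardless of what is installed on the slave.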


was (Author: cnstar9988):
When I add MESOS_HADOOP_CONF_DIR to every mesos-master-env.sh and mesos-slave-env.sh, it throws the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/executor/MesosExecutorBackend
Caused by: java.lang.ClassNotFoundException: org.apache.spark.executor.MesosExecutorBackend

similar to https://github.com/apache/spark/pull/620








