Hi all,

When I execute:
/spark-1.1.1-bin-hadoop2.4/bin/spark-submit --verbose --master yarn-cluster 
--class spark.SimpleApp --jars 
/spark-1.1.1-bin-hadoop2.4/lib/spark-assembly-1.1.1-hadoop2.4.0.jar 
--executor-memory 1G --num-executors 2 
/spark-1.1.1-bin-hadoop2.4/testfile/simple-project-1.0.jar

it fails with this error:
appDiagnostics: Application application_1417686359838_0011 failed 2 times due 
to AM Container for appattempt_1417686359838_0011_000002 exited with  exitCode: 
-1000 due to: Resource 
hdfs://192.168.70.23:9000/user/root/.sparkStaging/application_1417686359838_0011/spark-assembly-1.1.1-hadoop2.4.0.jar
 changed on src filesystem (expected 1417745205278, was 1417745206168

When I remove the --jars option, i.e. execute:
/spark-1.1.1-bin-hadoop2.4/bin/spark-submit --verbose --master yarn-cluster 
--class spark.SimpleApp --executor-memory 1G --num-executors 2 
/spark-1.1.1-bin-hadoop2.4/testfile/simple-project-1.0.jar

it fails with the error below:
appDiagnostics: Application application_1417686359838_0012 failed 2 times due 
to AM Container for appattempt_1417686359838_0012_000002 exited with  exitCode: 
-1000 due to: File does not exist: 
hdfs://192.168.70.23:9000/user/root/.sparkStaging/application_1417686359838_0012/spark-assembly-1.1.1-hadoop2.4.0.jar
.Failing this attempt.. Failing the application.

So my questions are:

1. Will Spark automatically upload the assembly jar to HDFS, or do I need to upload it manually with the --jars option?

2. If it must be uploaded with --jars, what causes the "changed on src filesystem" error, and how can I fix it? I have already tried deleting all the jar files in the ".sparkStaging" folder, but the error persists.
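In case it helps to narrow things down, this is what I was planning to try next, based on the spark.yarn.jar setting I found in the docs: put the assembly on HDFS once and point Spark at that copy, so it is not re-uploaded on every submit (the HDFS path below is just my cluster's; adjust as needed):

```shell
# Upload the assembly jar to HDFS once
hdfs dfs -mkdir -p /user/root/spark-jars
hdfs dfs -put /spark-1.1.1-bin-hadoop2.4/lib/spark-assembly-1.1.1-hadoop2.4.0.jar \
    /user/root/spark-jars/

# Point Spark at the HDFS copy instead of passing it via --jars
/spark-1.1.1-bin-hadoop2.4/bin/spark-submit --verbose --master yarn-cluster \
    --class spark.SimpleApp \
    --conf spark.yarn.jar=hdfs://192.168.70.23:9000/user/root/spark-jars/spark-assembly-1.1.1-hadoop2.4.0.jar \
    --executor-memory 1G --num-executors 2 \
    /spark-1.1.1-bin-hadoop2.4/testfile/simple-project-1.0.jar
```

Is that the recommended approach, or am I on the wrong track?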

I'd really appreciate any help with this. Thanks in advance.

Best regards,
LEO HU
CD&SP
SAP LABS CHINA
