Hi Yana - 

I added the following to spark-class:

echo RUNNER: $RUNNER
echo CLASSPATH: $CLASSPATH
echo JAVA_OPTS: $JAVA_OPTS
echo '$@': $@

Here's the output:

$ ./spark-submit --class experiments.SimpleApp --master
spark://myhost.local:7077
/IdeaProjects/spark-experiments/target/spark-experiments-1.0-SNAPSHOT.jar

Spark assembly has been built with Hive, including Datanucleus jars on
classpath

RUNNER:
/Library/Java/JavaVirtualMachines/jdk1.7.0_13.jdk/Contents/Home/bin/java

CLASSPATH:
::/dev/spark-1.0.2-bin-hadoop2/conf:/dev/spark-1.0.2-bin-hadoop2/lib/spark-assembly-1.0.2-hadoop2.2.0.jar:/dev/spark-1.0.2-bin-hadoop2/lib/datanucleus-api-jdo-3.2.1.jar:/dev/spark-1.0.2-bin-hadoop2/lib/datanucleus-core-3.2.2.jar:/dev/spark-1.0.2-bin-hadoop2/lib/datanucleus-rdbms-3.2.1.jar

JAVA_OPTS: -XX:MaxPermSize=128m -Djava.library.path= -Xms512m -Xmx512m

$@: org.apache.spark.deploy.SparkSubmit --class experiments.SimpleApp
--master spark://myhost.local:7077
/IdeaProjects/spark-experiments/target/spark-experiments-1.0-SNAPSHOT.jar

The differences I can see versus the command that runs via my standalone Java app:
- It does not have -Djava.library.path= (should not make a difference)
- The main class is org.apache.spark.executor.CoarseGrainedExecutorBackend
instead of org.apache.spark.deploy.SparkSubmit (should not make a
difference)
- My jar's classes are directly available when running via spark-submit (it
runs the jar, so they end up in the main classloader), but they are only
available via conf.setJars() in the standalone Java app. They should still be
available indirectly through the classloader that is created in the executor:

14/09/08 10:04:06 INFO Executor: Adding
file:/dev/spark-1.0.2-bin-hadoop2/work/app-20140908100358-0002/1/./spark-experiments-1.0-SNAPSHOT.jar
to class loader

I've been assuming that my conf.setJars() is the proper way to provide my
code to Spark.  
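In case it helps, the relevant setup in my standalone Java app looks roughly
like this (a sketch, not the exact code; the app name, master URL, and jar
path are just the same placeholders I used above):

package experiments;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SimpleApp {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
            .setAppName("SimpleApp")
            .setMaster("spark://myhost.local:7077")
            // Ship the application jar to the executors so its classes
            // get added to the executor's classloader (the "Adding ... to
            // class loader" line in the executor log above).
            .setJars(new String[] {
                "/IdeaProjects/spark-experiments/target/spark-experiments-1.0-SNAPSHOT.jar"
            });
        JavaSparkContext sc = new JavaSparkContext(conf);
        // ... job code ...
        sc.stop();
    }
}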

Thanks!
