Hi,

I'm a junior Spark user from China.

I have a question about submitting Spark jobs: I want to submit a job
from code.

In other words: how can I submit a Spark job to a YARN cluster from within a
Java program, without using spark-submit?


I've learnt from the official site
http://spark.apache.org/docs/latest/submitting-applications.html
that submitting a job to a cluster with the bin/spark-submit script is easy.


That is because the script does a lot of complex work, such as setting up
the classpath with Spark and its dependencies.

If I don't use the script, I have to handle all of that complexity myself,
which is really frustrating.


I have searched for this problem on Google, but the answers don't seem to
fit my case.


In Hadoop development, I know that after setting up the Configuration, the
Job, and the resources, we can submit a Hadoop job in code like this:

job.waitForCompletion(true);

This makes it convenient for users to submit jobs programmatically.
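
For illustration, here is roughly what such a MapReduce driver looks like
(the class name, job name, and paths are just placeholders, and the default
identity mapper/reducer are used to keep the sketch minimal):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class HadoopSubmitExample {
    public static void main(String[] args) throws Exception {
        // Configuration is built in code; it picks up core-site.xml /
        // yarn-site.xml from the classpath
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "submit-from-code");  // job name is just an example
        job.setJarByClass(HadoopSubmitExample.class);
        // A real job would also set mapper/reducer/output classes here;
        // the defaults keep this sketch minimal.
        FileInputFormat.addInputPath(job, new Path(args[0]));   // input path from the caller
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // output path from the caller
        // Submit to the cluster directly from the Java program and wait for completion
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}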


I want to know if there is a plan (maybe in Spark 1.5+?) to provide users
with a similar variety of ways to submit jobs programmatically, like Hadoop.
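
The closest thing I have seen so far is the SparkLauncher class added in
1.4, but as far as I understand it still forks a spark-submit child process
rather than submitting in-process the way job.waitForCompletion does. A
minimal sketch of how I understand it (the Spark home, jar path, and main
class are placeholders for my environment):

import org.apache.spark.launcher.SparkLauncher;

public class SparkLauncherExample {
    public static void main(String[] args) throws Exception {
        Process spark = new SparkLauncher()
                .setSparkHome("/opt/spark")             // assumed Spark installation path
                .setAppResource("/path/to/my-app.jar")  // assumed application jar
                .setMainClass("com.example.MyApp")      // assumed main class
                .setMaster("yarn-cluster")              // submit to YARN in cluster mode
                .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
                .launch();                               // forks a spark-submit child process
        int exitCode = spark.waitFor();
        System.out.println("spark-submit exited with " + exitCode);
    }
}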

Take monitoring, for example: in the recent Spark release (1.4.0) we can
already get the status of Spark applications through a REST API.
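
That endpoint can already be queried from plain Java, which is the kind of
programmatic access I am hoping for with submission too. A small sketch (the
host and port are assumptions for a locally running driver; the history
server would be queried on 18080 instead):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class RestMonitoringExample {
    public static void main(String[] args) throws Exception {
        // 4040 is the default UI port of a running driver
        URL url = new URL("http://localhost:4040/api/v1/applications");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);  // JSON list of applications known to this UI
            }
        }
    }
}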


Thanks & Regards

GUO QIAN
