Hi,
Since it is not currently possible to submit a Spark job to a Spark cluster
running in standalone mode with the cluster deploy mode (the deploy mode
cannot be specified from within the code), can I do it with YARN?
I tried to do something like this (in Scala):
«
... // Client object - main method
System.setProperty("SPARK_YARN_MODE", "true")
val sparkConf = new SparkConf()
try {
  val args = new ClientArguments(argStrings, sparkConf)
  new Client(args, sparkConf).run()
} catch {
  case e: Exception =>
    Console.err.println(e.getMessage)
    System.exit(1)
}
System.exit(0)
»
(from http://blog.sequenceiq.com/blog/2014/08/22/spark-submit-in-java/)
However, it is not possible to create a new instance of Client, since
org.apache.spark.deploy.yarn.Client is private to the Spark package.
Is there any way I can submit Spark jobs from code in cluster mode
without using the spark-submit script?
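For reference, one approach I have been looking at is the SparkLauncher API
(org.apache.spark.launcher.SparkLauncher, available from Spark 1.4), which
submits an application programmatically, though under the hood it still spawns
spark-submit as a child process. A minimal sketch (the paths, jar name, and
main class below are placeholders, not real values):

```scala
import org.apache.spark.launcher.SparkLauncher

object SubmitFromCode {
  def main(args: Array[String]): Unit = {
    // All paths and class names here are hypothetical -- substitute your own.
    val process = new SparkLauncher()
      .setSparkHome("/opt/spark")             // assumed Spark installation dir
      .setAppResource("/path/to/my-app.jar")  // your application jar
      .setMainClass("com.example.MyApp")      // your application's main class
      .setMaster("yarn")
      .setDeployMode("cluster")
      .launch()                               // spawns spark-submit as a subprocess

    // Wait for the submission process to finish and propagate its exit code.
    val exitCode = process.waitFor()
    System.exit(exitCode)
  }
}
```

This avoids calling the spark-submit script yourself, but it is not a fully
in-process submission, so I am not sure it counts as "from the code" in the
sense I meant above.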
Thanks.