Hello,

If you are looking for the command to submit the job, the following works:

spark-submit --class "SampleTest" --master yarn-cluster \
  --num-executors 4 --executor-cores 2 \
  /home/priya/Spark/Func1/target/scala-2.10/simple-project_2.10-1.0.jar
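As the earlier replies note, the same options can also be populated into a SparkConf and passed to a SparkContext for programmatic submission. A minimal Scala sketch, assuming Spark 1.x is on the classpath; the app name, jar path, and executor settings simply mirror the spark-submit flags above:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: yarn-cluster mode cannot be set programmatically, so this
// uses yarn-client (the thread reports standalone cluster mode is more
// reliable for remote submission; a spark://host:7077 URL works the same way).
val conf = new SparkConf()
  .setAppName("SampleTest")
  .setMaster("yarn-client")
  .set("spark.executor.instances", "4") // equivalent of --num-executors
  .set("spark.executor.cores", "2")     // equivalent of --executor-cores
  .setJars(Seq("/home/priya/Spark/Func1/target/scala-2.10/simple-project_2.10-1.0.jar"))

val sc = new SparkContext(conf)
// ... run jobs against sc, then shut down cleanly
sc.stop()
```
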

On Tue, Apr 7, 2015 at 6:36 PM, Veena Basavaraj <vybs.apa...@gmail.com>
wrote:

>
> The following might be helpful.
>
>
> http://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/What-dependencies-to-submit-Spark-jobs-programmatically-not-via/td-p/24721
>
> http://blog.sequenceiq.com/blog/2014/08/22/spark-submit-in-java/
>
> On 7 April 2015 at 16:32, michal.klo...@gmail.com <michal.klo...@gmail.com
> > wrote:
>
>> A SparkContext can submit jobs remotely.
>>
>> The spark-submit options in general can be populated into a SparkConf and
>> passed in when you create a SparkContext.
>>
>> We personally have not had too much success with yarn-client remote
>> submission, but standalone cluster mode was easy to get going.
>>
>> M
>>
>>
>>
>> On Apr 7, 2015, at 7:01 PM, Prashant Kommireddi <prash1...@gmail.com>
>> wrote:
>>
>> Hello folks,
>>
>> Newbie here! Just had a quick question - is there a job submission API,
>> such as the one in Hadoop
>>
>> https://hadoop.apache.org/docs/r2.3.0/api/org/apache/hadoop/mapreduce/Job.html#submit()
>> to submit Spark jobs to a YARN cluster? I see in the examples that
>> bin/spark-submit is what's out there, but I couldn't find any APIs around it.
>>
>> Thanks,
>> Prashant
>>
>>
>
>
> --
> Regards
> vybs
>



-- 
Regards,
Haripriya Ayyalasomayajula
Graduate Student
Department of Computer Science
University of Houston
Contact : 650-796-7112
