Yeah, I was going to suggest looking at the code too. It's a shame there isn't a page in the docs that covers the port 6066 REST API.
On Tue, Oct 6, 2015 at 10:16 AM, Ted Yu <yuzhih...@gmail.com> wrote:
> Please take a look at:
> org.apache.spark.deploy.rest.RestSubmissionClient
>
> which is used by core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
>
> FYI
>
> On Tue, Oct 6, 2015 at 10:08 AM, shahid qadri <shahidashr...@icloud.com> wrote:
>
>> hi Jeff
>> Thanks
>> More specifically i need the Rest api to submit pyspark job, can you
>> point me to Spark submit REST api
>>
>> On Oct 6, 2015, at 10:25 PM, Jeff Nadler <jnad...@srcginc.com> wrote:
>>
>> Spark standalone doesn't come with a UI for submitting jobs. Some
>> Hadoop distros might, for example EMR in AWS has a job submit UI.
>>
>> Spark submit just calls a REST api, you could build any UI you want on
>> top of that...
>>
>> On Tue, Oct 6, 2015 at 9:37 AM, shahid qadri <shahidashr...@icloud.com> wrote:
>>
>>> Hi Folks
>>>
>>> How i can submit my spark app(python) to the cluster without using
>>> spark-submit, actually i need to invoke jobs from UI
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org
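For anyone else digging through RestSubmissionClient: here's a rough sketch of the JSON body it POSTs to the standalone master's REST gateway at `http://<master>:6066/v1/submissions/create`. This is an undocumented/internal API, so treat the field names and values below as my reading of the CreateSubmissionRequest message, not a guarantee; the app resource path, class name, and version string are placeholders you'd replace with your own.

```python
import json

def build_submit_payload(app_resource, main_class, app_args, spark_master):
    """Build the CreateSubmissionRequest body that the standalone REST
    gateway expects. Field names mirror what RestSubmissionClient sends;
    since the API is undocumented, verify against your Spark version."""
    return {
        "action": "CreateSubmissionRequest",
        "clientSparkVersion": "1.5.1",            # should match the cluster's version
        "appResource": app_resource,              # app artifact reachable by the cluster
        "mainClass": main_class,
        "appArgs": app_args,
        "environmentVariables": {"SPARK_ENV_LOADED": "1"},
        "sparkProperties": {
            "spark.master": spark_master,
            "spark.app.name": "rest-submit-example",
            "spark.submit.deployMode": "cluster", # the REST gateway drives cluster mode
        },
    }

# Hypothetical jar path and class, just to show the shape of the request.
payload = build_submit_payload(
    "hdfs:///apps/myapp.jar",
    "com.example.Main",
    ["arg1"],
    "spark://master-host:7077",
)
print(json.dumps(payload, indent=2))

# To actually submit, POST this JSON (Content-Type: application/json) to
# http://master-host:6066/v1/submissions/create using curl, urllib.request,
# etc. The response carries a submissionId which you can then poll via
# /v1/submissions/status/<submissionId>.
```

So a UI for job submission is basically a form that fills in this dict and POSTs it; whether the gateway accepts a Python app as `appResource` the same way is something I haven't verified, so test against your cluster first.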