Re: Launch a pyspark Job From UI

2018-06-11 Thread Sathishkumar Manimoorthy
You can use Zeppelin as well: https://zeppelin.apache.org/docs/latest/interpreter/spark.html

Thanks,
Sathish

On Mon, Jun 11, 2018 at 4:25 PM, hemant singh wrote:
> You can explore Livy: https://dzone.com/articles/quick-start-with-apache-livy
>
> On Mon, Jun 11, 2018 at 3:35 PM, srungarapu vam
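[Editor's note: for reference, a minimal sketch of the Livy route mentioned above, submitting a PySpark script as a batch job over Livy's REST API. The endpoint, script path, and resource figures are placeholders, not values from the thread.]

    # Minimal sketch: submit a PySpark script as a Livy batch job over REST.
    # Assumes a Livy server at http://localhost:8998 and a script already on
    # HDFS; both are placeholders.
    import json
    import requests

    LIVY_URL = "http://localhost:8998"  # hypothetical Livy endpoint

    payload = {
        "file": "hdfs:///jobs/wordcount.py",   # hypothetical PySpark script
        "args": ["hdfs:///data/input.txt"],
        "executorMemory": "2g",
        "numExecutors": 2,
    }

    resp = requests.post(
        f"{LIVY_URL}/batches",
        data=json.dumps(payload),
        headers={"Content-Type": "application/json"},
    )
    resp.raise_for_status()
    batch_id = resp.json()["id"]

    # Poll the batch state until a UI (or this script) decides it is done.
    state = requests.get(f"{LIVY_URL}/batches/{batch_id}/state").json()["state"]
    print(batch_id, state)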

Re: Spark YARN Error - triggering spark-shell

2018-06-08 Thread Sathishkumar Manimoorthy
It seems your spark-on-yarn application is not able to get its application master:

org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.

Check once in the YARN logs.

Thanks,
Sathish

On Fri, Jun 8, 2018 at 2:22 PM,
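[Editor's note: for reference, a minimal sketch of pulling the aggregated YARN logs so the ApplicationMaster failure reason is visible. The application ID is a placeholder; take the real one from the spark-submit output or the ResourceManager UI.]

    # Minimal sketch: fetch aggregated YARN logs for a failed application.
    import subprocess

    app_id = "application_1528400000000_0001"  # hypothetical application ID

    logs = subprocess.run(
        ["yarn", "logs", "-applicationId", app_id],
        capture_output=True,
        text=True,
        check=True,
    ).stdout

    # The AM container's stderr usually holds the actual launch failure.
    for line in logs.splitlines():
        if "Error" in line or "Exception" in line:
            print(line)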

Re: [Spark-Submit] Where to store data files while running job in cluster mode?

2017-09-29 Thread Sathishkumar Manimoorthy
Place it in HDFS and reference that path in your code.

Thanks,
Sathish

On Fri, Sep 29, 2017 at 3:31 PM, Gaurav1809 wrote:
> Hi All,
>
> I have a multi-node Spark cluster (1 master, 2 workers); the job reads CSV
> file data and works fine when run in local mode (Loca
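[Editor's note: for reference, a minimal PySpark sketch of reading the CSV from HDFS instead of a driver-local path, so every executor in the cluster can reach the data. The namenode host/port and file path are placeholders.]

    # Minimal sketch: read a CSV from HDFS in cluster mode.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("read-csv-from-hdfs").getOrCreate()

    df = spark.read.csv(
        "hdfs://namenode:8020/data/input.csv",  # hypothetical HDFS location
        header=True,
        inferSchema=True,
    )
    df.show(5)

    spark.stop()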

Re: Debugging Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

2017-09-26 Thread Sathishkumar Manimoorthy
@Ayan It seems to be running on Spark standalone, not on YARN, I guess.

Thanks,
Sathish

On Tue, Sep 26, 2017 at 9:09 PM, ayan guha wrote:
> I would check the queue you are submitting the job to, assuming it is yarn...
>
> On Tue, Sep 26, 2017 at 11:40 PM, JG Perrin wrote:
>
>> Hi,
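[Editor's note: for reference, a minimal sketch of keeping a standalone-mode application's resource request within what the workers actually offer, since an oversized request is a common cause of this message. The master URL and figures are placeholders to check against the master UI (port 8080 by default).]

    # Minimal sketch: cap executor memory and total cores so the standalone
    # master can actually schedule the application.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("resource-fit-check")
        .master("spark://master-host:7077")     # hypothetical standalone master
        .config("spark.executor.memory", "1g")  # <= memory shown per worker
        .config("spark.cores.max", "2")         # <= free cores shown in the UI
        .getOrCreate()
    )

    # Trivial action to confirm the job is accepted and tasks run.
    print(spark.sparkContext.parallelize(range(10)).sum())

    spark.stop()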