Re: which master option to view current running job in Spark UI
Hi Jeff,

The issue was with viewing the UI on EC2. I had to set up SSH tunnels to view the currently running job.

Thanks,
Divya

On 24 February 2016 at 10:33, Jeff Zhang wrote:

> View running job in SPARK UI doesn't matter which master you use. [...]
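The tunnel setup Divya mentions might look roughly like the sketch below. The hostname, user, and key path are placeholders, and the ports assume the YARN ResourceManager and Spark driver defaults (8088 and 4040); none of these details are given in the thread.

```shell
# Forward the YARN ResourceManager UI (default 8088) and the Spark driver UI
# (default 4040) from the EC2 master node to the local machine.
# "ec2-master.example.com", "ec2-user", and the key path are hypothetical.
ssh -i ~/.ssh/my-key.pem \
    -N \
    -L 8088:localhost:8088 \
    -L 4040:localhost:4040 \
    ec2-user@ec2-master.example.com
```

With the tunnel up, http://localhost:8088 reaches the ResourceManager UI and http://localhost:4040 reaches the driver's Spark UI for a client-mode job. A dynamic SOCKS proxy (`ssh -D`) is a common alternative, since the Hadoop web UIs link across hosts in the cluster.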
Re: which master option to view current running job in Spark UI
Viewing a running job in the Spark UI doesn't depend on which master you use. What do you mean by "I can't see the currently running jobs in the Spark web UI"? Do you see a blank Spark UI, or can you not open the Spark UI at all?

On Mon, Feb 15, 2016 at 12:55 PM, Sabarish Sasidharan <sabarish.sasidha...@manthan.com> wrote:

> When running in YARN, you can use the YARN Resource Manager UI to get to
> the ApplicationMaster URL, irrespective of client or cluster mode. [...]

--
Best Regards

Jeff Zhang
Re: which master option to view current running job in Spark UI
When running in YARN, you can use the YARN Resource Manager UI to get to the ApplicationMaster URL, irrespective of client or cluster mode.

Regards
Sab

On 15-Feb-2016 10:10 am, "Divya Gehlot" wrote:

> Hi,
> I have a Hortonworks 2.3.4 cluster on EC2 and have Spark jobs as Scala
> files. [...]
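Besides the RM web UI Sab points to, the tracking URL of a running application can also be pulled from the YARN CLI. This is a generic sketch, not a command from the thread:

```shell
# List running YARN applications. The final column of each row is the
# tracking URL, which the ResourceManager proxies to the Spark UI of the
# running application (the same ApplicationMaster link as in the RM web UI).
yarn application -list -appStates RUNNING
```

The tracking URL typically has the form http://<rm-host>:8088/proxy/<application-id>/ and works the same in client and cluster mode.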
which master option to view current running job in Spark UI
Hi,
I have a Hortonworks 2.3.4 cluster on EC2 and have Spark jobs as Scala files. I am a bit confused about the --master options; I want to execute this Spark job on YARN.

Currently I run it as:

spark-shell --properties-file /TestDivya/Spark/Oracle.properties \
  --jars /usr/hdp/2.3.4.0-3485/spark/lib/ojdbc6.jar \
  --driver-class-path /usr/hdp/2.3.4.0-3485/spark/lib/ojdbc6.jar \
  --packages com.databricks:spark-csv_2.10:1.1.0 \
  --master yarn-client \
  -i /TestDivya/Spark/Test.scala

With this option I can't see the currently running jobs in the Spark web UI, though they appear later in the Spark history server.

My question: with which --master option should I run my Spark jobs so that I can view the currently running jobs in the Spark web UI?

Thanks,
Divya
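For reference, the invocation above can also be written with the master and deploy mode as separate flags, which behaves the same as --master yarn-client (and is the spelling later Spark versions standardized on). This is a sketch reusing the paths from the question as-is:

```shell
# Equivalent submission with master and deploy mode spelled out. In client
# mode the driver runs on the submitting host and serves the live Spark UI
# on port 4040 there; the history server only shows the job after it ends.
spark-shell --properties-file /TestDivya/Spark/Oracle.properties \
  --jars /usr/hdp/2.3.4.0-3485/spark/lib/ojdbc6.jar \
  --driver-class-path /usr/hdp/2.3.4.0-3485/spark/lib/ojdbc6.jar \
  --packages com.databricks:spark-csv_2.10:1.1.0 \
  --master yarn --deploy-mode client \
  -i /TestDivya/Spark/Test.scala
```

So the choice of --master is not the cause here: for a running client-mode job the UI is at port 4040 on the driver host (or via the YARN RM's ApplicationMaster proxy link), and on EC2 that port has to be reachable from the browser, e.g. through an SSH tunnel.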