Here is the YARN RM REST API for reference:
http://hadoop.apache.org/docs/r2.7.0/hadoop-yarn/hadoop-yarn-site/ResourceManagerRest.html
You can use these APIs to query applications running on YARN.
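As a quick sketch of what such a query looks like (the RM host name is a placeholder; 8088 is the RM's default web port, and the response shape is the one documented for Hadoop 2.7):

```python
# Minimal sketch: list RUNNING YARN applications via the RM REST API.
# "rm-host" is a placeholder; adjust to your ResourceManager address.
import json
from urllib.request import urlopen

def running_app_ids(payload):
    """Pull application ids out of a /ws/v1/cluster/apps response."""
    apps = (payload.get("apps") or {}).get("app") or []
    return [a["id"] for a in apps if a.get("state") == "RUNNING"]

def fetch_running(rm="http://rm-host:8088"):
    """Query the RM for RUNNING apps (only works against a live cluster)."""
    with urlopen(rm + "/ws/v1/cluster/apps?states=RUNNING") as resp:
        return running_app_ids(json.load(resp))

# Offline demo with a payload in the shape the RM returns:
sample = {"apps": {"app": [
    {"id": "application_1473435925047_0001", "state": "RUNNING"},
    {"id": "application_1473435925047_0002", "state": "FINISHED"},
]}}
print(running_app_ids(sample))  # ['application_1473435925047_0001']
```

The `states=RUNNING` filter also works server-side, so the client-side state check is belt-and-braces.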

On Sun, Sep 11, 2016 at 11:25 PM, Jacek Laskowski <ja...@japila.pl> wrote:

> Hi Vladimir,
>
> You'd have to talk to your cluster manager to query for all the
> running Spark applications. I'm pretty sure YARN and Mesos can do that
> but unsure about Spark Standalone. This is certainly not something a
> Spark application's web UI could do for you since it is designed to
> handle the single Spark application.
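For Standalone, if you do fall back to probing the 404x ports as asked later in the thread, a rough sketch might look like this (host, port range, and timeout are all assumptions, not fixed Spark limits):

```python
# Probe consecutive driver UI ports for Spark's per-application REST API.
# Spark tries 4040 first, then 4041, 4042, ... when a port is taken.
import json
from urllib.error import URLError
from urllib.request import urlopen

def candidate_ui_urls(host, start=4040, count=10):
    """Build the /api/v1/applications URL for each candidate port."""
    return [f"http://{host}:{p}/api/v1/applications"
            for p in range(start, start + count)]

def probe(urls, timeout=1.0):
    """Return {url: parsed JSON} for every port that answers with JSON."""
    found = {}
    for url in urls:
        try:
            with urlopen(url, timeout=timeout) as resp:
                found[url] = json.load(resp)
        except (URLError, ValueError, OSError):
            continue  # nothing listening there, or not a Spark UI
    return found

urls = candidate_ui_urls("localhost")
print(urls[0])  # http://localhost:4040/api/v1/applications
# apps = probe(urls)  # run against a live driver host
```

Each hit returns only that driver's own application, so you still have to merge the per-port results yourself.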
>
> Pozdrawiam,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sun, Sep 11, 2016 at 11:18 AM, Vladimir Tretyakov
> <vladimir.tretya...@sematext.com> wrote:
> > Hello Jacek, thx a lot, it works.
> >
> > Is there a way to get the list of running applications from the REST API?
> > Or do I have to try connecting to ports 4040, 4041, ... 40xx and check
> > whether they answer?
> >
> > Best regards, Vladimir.
> >
> > On Sat, Sep 10, 2016 at 6:00 AM, Jacek Laskowski <ja...@japila.pl> wrote:
> >>
> >> Hi,
> >>
> >> That's correct. One app one web UI. Open 4041 and you'll see the other
> >> app.
> >>
> >> Jacek
> >>
> >>
> >> On 9 Sep 2016 11:53 a.m., "Vladimir Tretyakov"
> >> <vladimir.tretya...@sematext.com> wrote:
> >>>
> >>> Hello again.
> >>>
> >>> I am trying to play with Spark 2.0.0 (the Scala 2.11 build).
> >>>
> >>> The problem is that the REST API and the UI show me different things.
> >>>
> >>> I've started 2 applications from the examples set: I opened 2 consoles
> >>> and ran the following command in each:
> >>>
> >>> ./bin/spark-submit --class org.apache.spark.examples.SparkPi \
> >>>   --master spark://wawanawna:7077 --executor-memory 2G \
> >>>   --total-executor-cores 30 examples/jars/spark-examples_2.11-2.0.0.jar 10000
> >>>
> >>> Request to API endpoint:
> >>>
> >>> http://localhost:4040/api/v1/applications
> >>>
> >>> returned the following JSON:
> >>>
> >>> [ {
> >>>   "id" : "app-20160909184529-0016",
> >>>   "name" : "Spark Pi",
> >>>   "attempts" : [ {
> >>>     "startTime" : "2016-09-09T15:45:25.047GMT",
> >>>     "endTime" : "1969-12-31T23:59:59.999GMT",
> >>>     "lastUpdated" : "2016-09-09T15:45:25.047GMT",
> >>>     "duration" : 0,
> >>>     "sparkUser" : "",
> >>>     "completed" : false,
> >>>     "startTimeEpoch" : 1473435925047,
> >>>     "endTimeEpoch" : -1,
> >>>     "lastUpdatedEpoch" : 1473435925047
> >>>   } ]
> >>> } ]
> >>>
> >>> so the response contains information about only 1 application.
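The payload above is well-formed; a minimal sketch of reading it (the JSON is copied verbatim from the response) shows the single in-flight attempt this driver's UI knows about:

```python
# Parse the /api/v1/applications response quoted above and pick out
# applications whose latest attempt has not completed.
import json

body = """
[ {
  "id" : "app-20160909184529-0016",
  "name" : "Spark Pi",
  "attempts" : [ {
    "startTime" : "2016-09-09T15:45:25.047GMT",
    "endTime" : "1969-12-31T23:59:59.999GMT",
    "lastUpdated" : "2016-09-09T15:45:25.047GMT",
    "duration" : 0,
    "sparkUser" : "",
    "completed" : false,
    "startTimeEpoch" : 1473435925047,
    "endTimeEpoch" : -1,
    "lastUpdatedEpoch" : 1473435925047
  } ]
} ]
"""

apps = json.loads(body)
running = [a["id"] for a in apps if not a["attempts"][-1]["completed"]]
print(running)  # ['app-20160909184529-0016']
```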
> >>>
> >>> But in reality I've started 2 applications, and the Spark UI shows me 2
> >>> RUNNING applications (please see screenshot).
> >>>
> >>> Does anybody know why the API and the UI show different things?
> >>>
> >>>
> >>> Best regards, Vladimir.
> >>>
> >>>
> >>> On Tue, Aug 30, 2016 at 3:52 PM, Vijay Kiran <m...@vijaykiran.com> wrote:
> >>>>
> >>>> Hi Otis,
> >>>>
> >>>> Did you check the REST API as documented in
> >>>> http://spark.apache.org/docs/latest/monitoring.html
> >>>>
> >>>> Regards,
> >>>> Vijay
> >>>>
> >>>> > On 30 Aug 2016, at 14:43, Otis Gospodnetić
> >>>> > <otis.gospodne...@gmail.com> wrote:
> >>>> >
> >>>> > Hi Mich and Vijay,
> >>>> >
> >>>> > Thanks!  I forgot to include an important bit - I'm looking for a
> >>>> > programmatic way to get Spark metrics when running Spark under YARN -
> >>>> > so JMX or an API of some kind.
> >>>> >
> >>>> > Thanks,
> >>>> > Otis
> >>>> > --
> >>>> > Monitoring - Log Management - Alerting - Anomaly Detection
> >>>> > Solr & Elasticsearch Consulting Support Training -
> >>>> > http://sematext.com/
> >>>> >
> >>>> >
> >>>> > On Tue, Aug 30, 2016 at 6:59 AM, Mich Talebzadeh
> >>>> > <mich.talebza...@gmail.com> wrote:
> >>>> > The Spark UI, regardless of deployment mode (Standalone, YARN, etc.),
> >>>> > runs on port 4040 by default and can be accessed directly.
> >>>> >
> >>>> > Otherwise one can specify a specific port, for example 55555, with
> >>>> > --conf "spark.ui.port=55555"
> >>>> >
> >>>> > HTH
> >>>> >
> >>>> > Dr Mich Talebzadeh
> >>>> >
> >>>> > LinkedIn
> >>>> > https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
> >>>> >
> >>>> > http://talebzadehmich.wordpress.com
> >>>> >
> >>>> > Disclaimer: Use it at your own risk. Any and all responsibility for
> >>>> > any loss, damage or destruction of data or any other property which
> >>>> > may arise from relying on this email's technical content is explicitly
> >>>> > disclaimed. The author will in no case be liable for any monetary
> >>>> > damages arising from such loss, damage or destruction.
> >>>> >
> >>>> >
> >>>> > On 30 August 2016 at 11:48, Vijay Kiran <m...@vijaykiran.com> wrote:
> >>>> >
> >>>> > From the YARN RM UI, find the Spark application id, and in the
> >>>> > application details you can click on the “Tracking URL”, which should
> >>>> > give you the Spark UI.
> >>>> >
> >>>> > ./Vijay
> >>>> >
> >>>> > > On 30 Aug 2016, at 07:53, Otis Gospodnetić
> >>>> > > <otis.gospodne...@gmail.com> wrote:
> >>>> > >
> >>>> > > Hi,
> >>>> > >
> >>>> > > When Spark is run on top of YARN, where/how can one get Spark
> >>>> > > metrics?
> >>>> > >
> >>>> > > Thanks,
> >>>> > > Otis
> >>>> > > --
> >>>> > > Monitoring - Log Management - Alerting - Anomaly Detection
> >>>> > > Solr & Elasticsearch Consulting Support Training -
> >>>> > > http://sematext.com/
> >>>> > >
> >>>> >
> >>>> >
> >>>> > ---------------------------------------------------------------------
> >>>> > To unsubscribe e-mail: user-unsubscr...@spark.apache.org
> >>>> >
> >>>> >
> >>>> >
> >>>>
> >>>>
> >>>>
> >>>
> >>>
> >>>
> >
> >
>
>
>
