Re: Spark metrics when running with YARN?

2016-08-30 Thread Otis Gospodnetić
…find the Spark application ID, and in the application details you can click on the “Tracking URL”, which should give you the Spark UI. ./Vijay > On 30 Aug 2016, at 07:53, Otis Gospodnetić <otis.gospodne...@gmail.com> wrote:
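The lookup described above can also be done programmatically against the YARN ResourceManager REST API, which returns each application's `trackingUrl` (the Spark UI while the app runs). A minimal sketch; the ResourceManager host and the application ID below are placeholder values, and 8088 is only the default RM web port:

```python
import json
from urllib.request import urlopen

def yarn_app_info_url(rm_host: str, app_id: str, rm_port: int = 8088) -> str:
    """Build the ResourceManager REST URL for a single YARN application."""
    return f"http://{rm_host}:{rm_port}/ws/v1/cluster/apps/{app_id}"

def spark_ui_url(rm_host: str, app_id: str) -> str:
    """Fetch the application's trackingUrl, which points at the Spark UI
    (or the history server once the application has finished)."""
    with urlopen(yarn_app_info_url(rm_host, app_id)) as resp:
        app = json.load(resp)["app"]
    return app["trackingUrl"]

# Hypothetical usage against a real cluster:
# print(spark_ui_url("rm.example.com", "application_1472500000000_0001"))
```

The same information is available interactively via `yarn application -list` or the ResourceManager web UI, as described in the reply above.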

Spark metrics when running with YARN?

2016-08-29 Thread Otis Gospodnetić
Hi, When Spark is run on top of YARN, where/how can one get Spark metrics? Thanks, Otis -- Monitoring - Log Management - Alerting - Anomaly Detection Solr & Elasticsearch Consulting Support Training - http://sematext.com/
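One answer to the question above: Spark has a built-in, Dropwizard-based metrics system configured through `conf/metrics.properties`, and it works under YARN as well. A minimal sketch that reports driver and executor metrics to CSV files (the output directory is a placeholder; on YARN the file typically has to be shipped to executors, e.g. with `--files`, or referenced via `spark.metrics.conf`):

```properties
# Report all instances' metrics to CSV files every 10 seconds
*.sink.csv.class=org.apache.spark.metrics.sink.CsvSink
*.sink.csv.period=10
*.sink.csv.unit=seconds
*.sink.csv.directory=/tmp/spark-metrics

# Also expose the standard JVM source on all instances
*.source.jvm.class=org.apache.spark.metrics.source.JvmSource
```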

Re: Spark Executor Metrics

2016-08-16 Thread Otis Gospodnetić
Hi Muhammad, You should give people a bit more time to answer/help you (for free). :) I don't have a direct answer for you, but you can look at SPM for Spark, which has all the instructions for getting all Spark metrics (Executors,

Re: Apache Flink

2016-04-17 Thread Otis Gospodnetić
While Flink may not be younger than Spark, Spark came to Apache first, which always helps. Plus, there was already a lot of buzz around Spark before it came to Apache. Coming from Berkeley also helps. That said, Flink seems decently healthy to me: -

Re: Monitoring tools for spark streaming

2015-09-29 Thread Otis Gospodnetić
Hi, There's also SPM for Spark -- http://sematext.com/spm/integrations/spark-monitoring.html SPM graphs all Spark metrics and gives you alerting, anomaly detection, etc. and if you ship your Spark and/or other logs to Logsene - http://sematext.com/logsene - you can correlate metrics, logs,

Replacing Esper with Spark Streaming?

2015-09-13 Thread Otis Gospodnetić
Hi, I'm wondering if anyone has attempted to replace Esper with Spark Streaming or if anyone thinks Spark Streaming is/isn't a good tool for the (CEP) job? We are considering Akka or Spark Streaming as possible Esper replacements and would appreciate any input from people who tried to do that

Re: Registering custom metrics

2015-06-23 Thread Otis Gospodnetić
Hi, Not sure if this will fit your needs, but if you are trying to collect+chart some metrics specific to your app, yet want to correlate them with what's going on in Spark, maybe Spark's performance numbers, you may want to send your custom metrics to SPM, so they can be
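For shipping metrics to an external monitoring backend as suggested above, Spark's metrics system includes a Graphite sink, and many monitoring services accept the Graphite protocol. A hedged sketch of the relevant `conf/metrics.properties` entries; the host, port, and prefix are placeholders, not SPM-specific values:

```properties
# Push all metrics to a Graphite-compatible endpoint every 10 seconds
*.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
*.sink.graphite.host=graphite.example.com
*.sink.graphite.port=2003
*.sink.graphite.period=10
*.sink.graphite.unit=seconds
*.sink.graphite.prefix=myapp
```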

Re: Monitoring Spark Jobs

2015-06-07 Thread Otis Gospodnetić
Hi Sam, Have a look at Sematext's SPM for your Spark monitoring needs. If the problem is CPU, IO, Network, etc., as Akhil mentioned, you'll see that in SPM, too. As for the number of jobs running, you can see a chart of that at http://sematext.com/spm/integrations/spark-monitoring.html Otis --