Integrating Spark task submission and monitoring

2020-03-24 Thread jianl miao
I need to integrate Spark into our own platform, which is built with Spring, to provide task submission and task monitoring. Spark tasks run on YARN in cluster mode, and our service may submit tasks to different YARN clusters. According to the current method provided
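[Editor's note: one programmatic route is Spark's launcher API (Spark 1.6+). A minimal sketch, not taken from this thread — the paths and class names below are hypothetical:]

    import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

    object SubmitToYarn {
      def main(args: Array[String]): Unit = {
        // Point HADOOP_CONF_DIR at a per-cluster config directory to
        // target different YARN clusters (directory is hypothetical).
        val env = java.util.Collections.singletonMap(
          "HADOOP_CONF_DIR", "/etc/hadoop/conf-cluster-a")
        val handle: SparkAppHandle = new SparkLauncher(env)
          .setSparkHome("/opt/spark")                // hypothetical path
          .setAppResource("hdfs:///jobs/my-job.jar") // hypothetical jar
          .setMainClass("com.example.MyJob")         // hypothetical class
          .setMaster("yarn")
          .setDeployMode("cluster")
          .startApplication()
        // Poll the handle until YARN reports a terminal state.
        while (!handle.getState.isFinal) {
          println(s"appId=${handle.getAppId} state=${handle.getState}")
          Thread.sleep(5000)
        }
      }
    }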

Re: Monitoring Spark application progress

2016-05-16 Thread Василец Дмитрий
Hello, the post is in Russian, so use Google Translate: https://mkdev.me/posts/ci-i-monitoring-spark-prilozheniy

Re: Monitoring Spark application progress

2016-05-16 Thread Василец Дмитрий
Spark + Zabbix + JMX: https://mkdev.me/posts/ci-i-monitoring-spark-prilozheniy (Russian; run it through Google Translate)
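[Editor's note: for the JMX leg of that setup, Spark's metrics system ships a JMX sink that can be enabled in conf/metrics.properties — a minimal sketch; Zabbix would then scrape the exposed MBeans:]

    # conf/metrics.properties
    *.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink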

Monitoring Spark application progress

2016-05-16 Thread Ashok Kumar
Hi, I would like to know the approach and tools, please, to get the full performance picture for a Spark app running through spark-shell and spark-submit: - Through the Spark UI at 4040? - Through OS utilities top, SAR - Through Java tools like JBuilder etc. - Through integrating Spark with
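[Editor's note: beyond those, assuming Spark 1.4+, the driver UI on port 4040 also serves a JSON REST API, so progress can be polled programmatically. A minimal sketch — host and port are assumptions:]

    import scala.io.Source

    object PollSparkRest {
      def main(args: Array[String]): Unit = {
        val base = "http://localhost:4040/api/v1" // assumes a local driver
        // Lists running applications as JSON (id, name, attempts, ...).
        println(Source.fromURL(s"$base/applications").mkString)
        // Per-app detail lives under /applications/<app-id>/stages,
        // /applications/<app-id>/executors, and so on.
      }
    }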

Monitoring Spark with Ganglia on ElCapo

2016-01-17 Thread william tellme
Does anyone have a link handy that describes configuring Ganglia on the Mac?

Re: Monitoring Spark HDFS Reads and Writes

2015-12-31 Thread Steve Loughran
> On 30 Dec 2015, at 13:19, alvarobrandon wrote: > > Hello: > > Is there any way of monitoring the number of bytes or blocks read and written > by a Spark application? I'm running Spark with YARN and I want to measure > how I/O-intensive a set of applications are.
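[Editor's note: Steve's reply is truncated out of the archive snippet. One general way to capture those numbers in-process is a SparkListener that sums input/output bytes over finished tasks — a sketch for Spark 2.x, where inputMetrics is non-optional (on 1.x it is an Option):]

    import java.util.concurrent.atomic.LongAdder
    import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

    class IoListener extends SparkListener {
      val bytesRead    = new LongAdder
      val bytesWritten = new LongAdder
      override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
        val m = taskEnd.taskMetrics
        if (m != null) {
          bytesRead.add(m.inputMetrics.bytesRead)
          bytesWritten.add(m.outputMetrics.bytesWritten)
        }
      }
    }
    // Register before running jobs: sc.addSparkListener(new IoListener)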

Re: Monitoring Spark HDFS Reads and Writes

2015-12-31 Thread Arkadiusz Bicz
Hello, Spark collects HDFS read/write metrics per application/job; see details at http://spark.apache.org/docs/latest/monitoring.html. I have connected Spark metrics to Graphite and display nice graphs in Grafana. BR, Arek
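[Editor's note: to reproduce that setup, the Graphite sink is configured in conf/metrics.properties. A minimal sketch — the host below is a placeholder:]

    # conf/metrics.properties
    *.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
    *.sink.graphite.host=graphite.example.com
    *.sink.graphite.port=2003
    *.sink.graphite.period=10
    *.sink.graphite.unit=seconds
    *.sink.graphite.prefix=spark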

Monitoring Spark HDFS Reads and Writes

2015-12-30 Thread alvarobrandon
in advance. Best Regards. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Monitoring-Spark-HDFS-Reads-and-Writes-tp25838.html

Re: Monitoring Spark Jobs

2015-06-10 Thread Himanshu Mehra
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Monitoring-Spark-Jobs-tp23193p23243.html

Re: Monitoring Spark Jobs

2015-06-07 Thread Otis Gospodnetić
the number of cores in the cluster. I suspect there is a bottleneck somewhere else. Regards, Sam -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Monitoring-Spark-Jobs-tp23193.html

Monitoring Spark Jobs

2015-06-07 Thread SamyaMaiti
the average response time increasing with the number of requests, in spite of increasing the number of cores in the cluster. I suspect there is a bottleneck somewhere else. Regards, Sam -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Monitoring-Spark-Jobs

Re: Monitoring Spark Jobs

2015-06-07 Thread Akhil Das
increasing with the number of requests, in spite of increasing the number of cores in the cluster. I suspect there is a bottleneck somewhere else. Regards, Sam -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Monitoring-Spark-Jobs-tp23193.html

Monitoring Spark with Graphite and Grafana

2015-02-26 Thread Ryan Williams
If anyone is curious to try exporting Spark metrics to Graphite, I just published a post about my experience doing that, building dashboards in Grafana http://grafana.org/, and using them to monitor Spark jobs: http://www.hammerlab.org/2015/02/27/monitoring-spark-with-graphite-and-grafana/ Code

RE: Monitoring Spark with Graphite and Grafana

2015-02-26 Thread Shao, Saisai
Cool, great job ☺. Thanks, Jerry

Re: Monitoring Spark

2014-12-05 Thread Andrew Or
If you're only interested in a particular instant, a simpler way is to check the executors page on the Spark UI: http://spark.apache.org/docs/latest/monitoring.html. By default each executor runs one task per core, so you can see how many tasks are being run at a given time and this translates

Re: Monitoring Spark

2014-12-04 Thread Sameer Farooqui
Are you running Spark in local or standalone mode? In either mode, you should be able to hit port 4040 (to see the Spark Jobs/Stages/Storage/Executors UI) on the machine where the driver is running. However, in local mode, you won't have a Spark Master UI on 8080 or a Worker UI on 8081. You can

Monitoring Spark

2014-12-03 Thread Isca Harmatz
Hello, I'm running Spark on a standalone station and I'm trying to view the event log after the run is finished. I turned on event logging as the site said (spark.eventLog.enabled set to true) but I can't find the log files or get the web UI to work. Any idea how to do this? Thanks, Isca
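[Editor's note: event logging needs both the flag and a log directory that exists before the app starts; the history server then serves past runs on port 18080. A minimal sketch — the paths are placeholders:]

    # conf/spark-defaults.conf
    spark.eventLog.enabled           true
    spark.eventLog.dir               file:/tmp/spark-events
    spark.history.fs.logDirectory    file:/tmp/spark-events

    # create the directory first, then start the history server
    mkdir -p /tmp/spark-events
    ./sbin/start-history-server.sh    # browse http://<host>:18080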

Monitoring Spark

2014-12-02 Thread Isca Harmatz
Hello, I'm running Spark on a cluster and I want to monitor how many nodes/cores are active at different (specific) points of the program. Is there any way to do this? Thanks, Isca
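[Editor's note: one in-program way to sample this — a sketch; getExecutorMemoryStatus is a developer API and counts the driver as one entry in most deployments:]

    // Call at the specific points of interest in the program.
    def logActiveExecutors(sc: org.apache.spark.SparkContext, label: String): Unit = {
      val executors = sc.getExecutorMemoryStatus.size - 1 // minus the driver
      println(s"[$label] executors=$executors, defaultParallelism=${sc.defaultParallelism}")
    }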

Re: Monitoring Spark

2014-12-02 Thread Otis Gospodnetic
Hi Isca, I think SPM can do that for you: http://blog.sematext.com/2014/10/07/apache-spark-monitoring/ Otis

Monitoring spark dis-associated workers

2014-06-10 Thread Allen Chang
such a health check? Thanks, Allen -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Monitoring-spark-dis-associated-workers-tp7358.html