Hello,
This (Russian-language) article covers the topic; run it through Google Translate:
https://mkdev.me/posts/ci-i-monitoring-spark-prilozheniy
On Mon, May 16, 2016 at 6:13 PM, Ashok Kumar
wrote:
> Hi,
>
> I would like to know the approach and tools please to get the full
> performance for a Spark app running through
spark + zabbix + jmx
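The spark + zabbix + jmx route above goes through Spark's metrics system: expose the metrics over JMX and let Zabbix's Java gateway poll them. A minimal sketch of `conf/metrics.properties` plus the driver JVM flags (the port number 9999 and the disabled auth/SSL are assumptions for illustration, not recommendations):

```properties
# conf/metrics.properties: expose all Spark metrics over JMX
*.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink
```

To reach the driver's MBeans remotely, the standard JDK remote-JMX flags can be passed, e.g. via `spark.driver.extraJavaOptions`:

```properties
spark.driver.extraJavaOptions=-Dcom.sun.management.jmxremote \
  -Dcom.sun.management.jmxremote.port=9999 \
  -Dcom.sun.management.jmxremote.authenticate=false \
  -Dcom.sun.management.jmxremote.ssl=false
```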
> On 30 Dec 2015, at 13:19, alvarobrandon wrote:
>
> Hello:
>
> Is there any way of monitoring the number of bytes or blocks read and written
> by a Spark application? I'm running Spark with YARN and I want to measure
> how I/O-intensive a set of applications are.
Hello,
Spark collects HDFS read/write metrics per application/job; see details at
http://spark.apache.org/docs/latest/monitoring.html.
I have connected the Spark metrics to Graphite and I am building nice
dashboards on top of them in Grafana.
BR,
Arek
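The Graphite hookup described above boils down to a few sink properties in `conf/metrics.properties`. A minimal sketch, where the host and the `spark` prefix are placeholders to adapt:

```properties
# Ship all Spark metrics to a Graphite/Carbon endpoint
*.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
*.sink.graphite.host=graphite.example.com
*.sink.graphite.port=2003
*.sink.graphite.period=10
*.sink.graphite.unit=seconds
*.sink.graphite.prefix=spark

# Optionally also ship driver/executor JVM metrics (heap, GC, ...)
driver.source.jvm.class=org.apache.spark.metrics.source.JvmSource
executor.source.jvm.class=org.apache.spark.metrics.source.JvmSource
```

Grafana can then be pointed at the same Graphite instance as a data source to build the dashboards.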
On Thu, Dec 31, 2015 at 2:00 PM, Steve Loughran
Hi Sam,
You might want to have a look at the Spark UI, which runs by default at
localhost:8080. You can also configure Ganglia to monitor your
cluster resources.
Thank you
Regards
Himanshu Mehra
Hi Sam,
Have a look at Sematext's SPM for your Spark monitoring needs. If the
problem is CPU, IO, network, etc., as Akhil mentioned, you'll see that in
SPM, too.
As for the number of jobs running, you can see a chart for that at
http://sematext.com/spm/integrations/spark-monitoring.html
Otis
--
It could be a CPU, IO, or network bottleneck; you need to figure out where
exactly it's choking. You can use monitoring utilities (like top)
to understand it better.
Thanks
Best Regards
On Sun, Jun 7, 2015 at 4:07 PM, SamyaMaiti samya.maiti2...@gmail.com
wrote:
Hi All,
I have a Spark
Cool, great job☺.
Thanks
Jerry
From: Ryan Williams [mailto:ryan.blake.willi...@gmail.com]
Sent: Thursday, February 26, 2015 6:11 PM
To: user; d...@spark.apache.org
Subject: Monitoring Spark with Graphite and Grafana
If anyone is curious to try exporting Spark metrics to Graphite, I just
If you're only interested in a particular instant, a simpler way is to
check the executors page on the Spark UI:
http://spark.apache.org/docs/latest/monitoring.html. By default each
executor runs one task per core, so you can see how many tasks are being
run at a given time and this translates
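The executors page described above is also exposed as JSON by the driver's REST API (`/api/v1/applications/<app-id>/executors`, available since Spark 1.4), which makes the same check scriptable. A small self-contained sketch that tallies running tasks the way the executors page does; the payload here is a hand-made sample for illustration, not a live response:

```python
import json

# Abbreviated sample of the JSON the driver returns from
# /api/v1/applications/<app-id>/executors (made up for illustration).
sample = json.dumps([
    {"id": "driver", "totalCores": 0, "activeTasks": 0},
    {"id": "1", "totalCores": 4, "activeTasks": 4},
    {"id": "2", "totalCores": 4, "activeTasks": 2},
])

def running_tasks(executors_json):
    """Sum activeTasks across executors, skipping the driver entry."""
    execs = json.loads(executors_json)
    return sum(e["activeTasks"] for e in execs if e["id"] != "driver")

print(running_tasks(sample))
```

Against a live application you would fetch the same JSON from the driver's UI port (4040 by default) instead of using the sample string.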
Are you running Spark in Local or Standalone mode? In either mode, you
should be able to hit port 4040 (to see the Spark
Jobs/Stages/Storage/Executors UI) on the machine where the driver is
running. However, in local mode, you won't have a Spark Master UI on 7080
or a Worker UI on 7081.
You can
Hi Isca,
I think SPM can do that for you:
http://blog.sematext.com/2014/10/07/apache-spark-monitoring/
Otis
--
Monitoring * Alerting * Anomaly Detection * Centralized Log Management
Solr Elasticsearch Support * http://sematext.com/
On Tue, Dec 2, 2014 at 11:57 PM, Isca Harmatz