Hi,
Please let me elaborate on my question so that you know exactly what I
want. I am running a Spark Streaming job that counts the number of
occurrences of each event. Right now I am using a key/value pair RDD,
where the key is the event and the value is its count.
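Independent of Spark, the counting logic described above is just a per-key merge: each incoming event increments the count stored under its key. A minimal plain-Java sketch of that logic (in the streaming job itself, `reduceByKey` or `updateStateByKey` applies the same merge per batch):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class EventCounts {
    // key = event name, value = number of occurrences seen so far.
    static Map<String, Long> countEvents(List<String> events) {
        Map<String, Long> counts = new HashMap<>();
        for (String event : events) {
            // merge: insert 1 if absent, otherwise add 1 to the existing count
            counts.merge(event, 1L, Long::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, Long> counts = countEvents(List.of("click", "view", "click"));
        System.out.println(counts.get("click") + " " + counts.get("view"));
        // prints "2 1"
    }
}
```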
Hi,
How can we use JMX and JConsole to monitor our Spark applications?
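For JConsole specifically, the JVM you want to inspect has to expose JMX: locally JConsole can attach by PID, but for a remote attach the driver (or executor) JVM needs the standard `com.sun.management.jmxremote` system properties. A sketch, where the port and the disabled auth/SSL are example values, not recommendations:

```
# Example JVM options to allow remote JConsole access
-Dcom.sun.management.jmxremote
-Dcom.sun.management.jmxremote.port=9999
-Dcom.sun.management.jmxremote.authenticate=false
-Dcom.sun.management.jmxremote.ssl=false
```

These can be passed to the driver via `spark-submit --conf "spark.driver.extraJavaOptions=..."`, then point JConsole at `host:9999`.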
Have you read this?
https://spark.apache.org/docs/latest/monitoring.html
*Romi Kuntsman*, *Big Data Engineer*
http://www.totango.com
On Thu, Nov 5, 2015 at 2:08 PM, Yogesh Vyas wrote:
> Hi,
> How can we use JMX and JConsole to monitor our Spark applications?
>
>
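The monitoring page linked above covers Spark's metrics system; to surface those metrics as JMX beans that JConsole can browse, the usual route is enabling the `JmxSink` in `metrics.properties` under `$SPARK_HOME/conf`. A minimal sketch:

```properties
# Minimal metrics.properties sketch: enable the JMX sink for all
# instances (master, worker, driver, executor).
*.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink
```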
Hi,
This article may help you; it shows how to expose your counter through an Akka actor:
https://tersesystems.com/2014/08/19/exposing-akka-actor-state-with-jmx/
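That article wraps the state in an Akka actor, but the core mechanism underneath is plain JMX: register an MBean with the platform MBeanServer, and JConsole can then browse its attributes. A minimal Akka-free sketch using only the JDK's `javax.management` API (names like `EventCounter` and the `example:` domain are arbitrary choices for illustration):

```java
import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import javax.management.ObjectName;

public class CounterJmx {
    // Standard MBean convention: interface named "<Impl>MBean",
    // implemented by the class that holds the state.
    public interface EventCounterMBean {
        long getCount();
    }

    public static class EventCounter implements EventCounterMBean {
        private volatile long count;
        public long getCount() { return count; }
        public void increment() { count++; }
    }

    public static void main(String[] args) throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        EventCounter counter = new EventCounter();
        ObjectName name = new ObjectName("example:type=EventCounter");
        server.registerMBean(counter, name);

        counter.increment();
        counter.increment();

        // Read the attribute back through JMX, as JConsole would.
        Object value = server.getAttribute(name, "Count");
        System.out.println("Count via JMX = " + value);
    }
}
```

The getter `getCount()` becomes the read-only attribute `Count` in JConsole's MBeans tab.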
Sent from Mail for Windows 10
From: Yogesh Vyas
Sent: November 5, 2015, 21:21
To: Romi Kuntsman
Cc: user@spark.apache.org
Subject: Re: JMX with Spark
Hi
Has anyone gotten this working? I have enabled the properties for it in
the metrics.conf file and ensured that it is placed under Spark's home
directory. Any ideas why I don't see Spark beans?
The home directory, or the conf directory under it? It works for me with
metrics.properties placed under the conf dir.
On Tue, Apr 15, 2014 at 6:08 PM, Paul Schooss <paulmscho...@gmail.com> wrote:
> Has anyone gotten this working? I have enabled the properties for it in
> the metrics.conf file and ensured that it is