You can also get the metrics from the Spark application event log file. See https://www.slideshare.net/JayeshThakrar/apache-bigdata2017sparkprofiling
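As a minimal sketch of that approach: the event log has to be enabled before the application runs, with settings along these lines in spark-defaults.conf (the log directory path below is illustrative, not a default):

```
# spark-defaults.conf (illustrative values)
spark.eventLog.enabled   true
spark.eventLog.dir       hdfs:///spark-event-logs
```

The resulting per-application log is newline-delimited JSON, one serialized listener event per line, so the same job/stage/task detail that reaches SparkListener instances can be parsed from it after the fact.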
From: "Qiao, Richard" <richard.q...@capitalone.com>
Date: Monday, December 4, 2017 at 6:09 PM
To: Nick Dimiduk <ndimi...@gmail.com>, "user@spark.apache.org" <user@spark.apache.org>
Subject: Re: Access to Applications metrics

It works to collect job-level metrics through the Jolokia Java agent.

Best Regards
Richard

From: Nick Dimiduk <ndimi...@gmail.com>
Date: Monday, December 4, 2017 at 6:53 PM
To: "user@spark.apache.org" <user@spark.apache.org>
Subject: Re: Access to Applications metrics

Bump.

On Wed, Nov 15, 2017 at 2:28 PM, Nick Dimiduk <ndimi...@gmail.com> wrote:

Hello,

I'm wondering whether it's possible to get access to the detailed job/stage/task-level metrics via the metrics system (JMX, Graphite, etc.). I've enabled the wildcard sink and I do not see them. It seems these values are only available over HTTP/JSON and to SparkListener instances; is this the case? Has anyone worked on a SparkListener that would bridge data from one to the other?

Thanks,
Nick
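A minimal sketch of the kind of bridge Nick asks about, assuming Spark 2.x and Dropwizard Metrics 3.x on the classpath (Dropwizard is the library Spark's own MetricsSystem is built on). The class name `TaskMetricsBridge`, the JMX domain, and the metric names are all hypothetical, not part of any Spark API:

```scala
// Sketch: a SparkListener that re-publishes task-level metrics to JMX
// via a Dropwizard MetricRegistry and JmxReporter.
import com.codahale.metrics.{JmxReporter, MetricRegistry}
import org.apache.spark.SparkContext
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

class TaskMetricsBridge(registry: MetricRegistry) extends SparkListener {
  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    // taskMetrics may be null when the task did not complete normally
    Option(taskEnd.taskMetrics).foreach { m =>
      registry.counter("tasks.completed").inc()
      registry.histogram("tasks.executorRunTimeMs").update(m.executorRunTime)
      registry.histogram("tasks.resultSizeBytes").update(m.resultSize)
    }
  }
}

object TaskMetricsBridge {
  def install(sc: SparkContext): Unit = {
    val registry = new MetricRegistry
    // Expose everything in the registry as MBeans under an illustrative domain
    JmxReporter.forRegistry(registry).inDomain("spark.bridge").build().start()
    sc.addSparkListener(new TaskMetricsBridge(registry))
  }
}
```

A listener like this can also be attached without code changes via `spark.extraListeners` (it then needs a zero-argument constructor). For Richard's Jolokia route, the agent is typically attached with something like `spark.driver.extraJavaOptions=-javaagent:/path/to/jolokia-jvm-agent.jar` (path illustrative), which then serves the driver's JMX MBeans, including anything a bridge like the above registers, over HTTP/JSON.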