Thanks & Regards,
Akshay Haryani
From: Aurélien Mazoyer
Date: Monday, September 6, 2021 at 5:47 AM
To: Haryani, Akshay
Cc: user@spark.apache.org
Subject: Re: Get application metric from Spark job
Hi Akshay,
Thank you for your reply. Sounds like a good idea, but I unfortunately have a
2.6 cluster. Do

Thanks & Regards,
Akshay Haryani

From: Aurélien Mazoyer
Date: Thursday, September 2, 2021 at 8:36 AM
To: user@spark.apache.org
Subject: Get application metric from Spark job

Hi community,
I would like to collect information about the execution of a Spark job
while it is running. Could I define some kind of application metrics (such
as a counter that would be incremented in my code) that I could retrieve
regularly while the job is running?
Thank you for your help,