Hello team,

I need to present the Spark job performance to my management. I can get
the execution time by recording the job's start and finish times (which
includes overhead). However, I am not sure how to get the other metrics,
e.g. CPU, I/O, memory, etc.
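
For context, this is roughly how I am getting the execution time today: a
minimal Scala sketch that just wraps the job in wall-clock timing. The app
name and workload below are placeholders, not our actual job.

import org.apache.spark.sql.SparkSession

object JobTiming {
  def main(args: Array[String]): Unit = {
    // Start the clock before the session is created so the measurement
    // also includes driver/session startup overhead.
    val start = System.nanoTime()

    val spark = SparkSession.builder()
      .appName("job-timing-example") // placeholder name
      .getOrCreate()

    // --- placeholder workload; the real job body goes here ---
    val rows = spark.range(0L, 1000000L).count()
    // ----------------------------------------------------------

    spark.stop()

    val elapsedMs = (System.nanoTime() - start) / 1000000L
    println(s"rows=$rows elapsedMs=$elapsedMs")
  }
}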

I want to measure an individual job, not the whole cluster. Please let me
know the best way to do this, and if there are any useful resources, please
provide links.


Thank you.