钟文波 Tue, 15 Aug 2017 04:22:40 -0700
How do I calculate the CPU time for a Spark job? Is there any interface that can be called directly,
like the Hadoop MapReduce framework, which reports "CPU time spent (ms)" in its Counters? Thanks!
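(For reference: one way to gather this yourself, assuming Spark 2.1 or later where `TaskMetrics.executorCpuTime` is available, is to sum per-task CPU time in a `SparkListener`. This is a sketch, not an official single "CPU time" counter like MapReduce's.)

```scala
import java.util.concurrent.atomic.AtomicLong
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Sums CPU time across all completed tasks of the application.
// TaskMetrics.executorCpuTime reports the CPU time the executor's
// task thread spent running the task, in nanoseconds (Spark 2.1+).
class CpuTimeListener extends SparkListener {
  private val totalCpuNanos = new AtomicLong(0L)

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    val metrics = taskEnd.taskMetrics
    if (metrics != null) {
      totalCpuNanos.addAndGet(metrics.executorCpuTime)
    }
  }

  def totalCpuMillis: Long = totalCpuNanos.get() / 1000000L
}

// Usage: register the listener before running the job, read it afterwards.
// val listener = new CpuTimeListener
// spark.sparkContext.addSparkListener(listener)
// ... run the job ...
// println(s"CPU time spent (ms): ${listener.totalCpuMillis}")
```

The same `executorCpuTime` figure is also shown per stage in the Spark web UI and is available through the REST API, so the listener is only needed if you want a programmatic aggregate inside the driver.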