Hi all,
I was running some Hive/Spark jobs on a Hadoop cluster, and I want to see how Spark
improves not only the elapsed time but also the total CPU consumption.
For Hive, I can get the 'Total MapReduce CPU Time Spent' from the log when the
job finishes, but I couldn't find any CPU stats for Spark jobs in either the Spark
log or the web UI. Is there any place I can find the total CPU consumption for my
Spark job? Thanks!
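The closest workaround I can think of is to register a SparkListener and sum the
per-task executorRunTime from the task metrics, but as far as I can tell that is
wall-clock run time on the executor, not CPU time. Something like this rough sketch
(the class and variable names are just made up for illustration):

import java.util.concurrent.atomic.AtomicLong
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Sums executorRunTime (milliseconds per finished task) over the whole job.
// Note: this is executor run time, not CPU time, so it is only an approximation.
class RunTimeListener extends SparkListener {
  val totalRunTimeMs = new AtomicLong(0L)

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    val metrics = taskEnd.taskMetrics
    if (metrics != null) {
      totalRunTimeMs.addAndGet(metrics.executorRunTime)
    }
  }
}

// Register before running the job, read the counter afterwards:
// val listener = new RunTimeListener
// sc.addSparkListener(listener)
// ... run the job ...
// println("Total executor run time (ms): " + listener.totalRunTimeMs.get)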
Here is the version info: Spark 1.3.0, Scala 2.10.4, Java 1.7.0_67.
Thanks!
Xiao
