Hi,
The Spark UI and logs don't show cluster-level resource usage. However, you
can use Ganglia to monitor the cluster. In spark-ec2, there is an option to
install Ganglia automatically.
If you use CDH, you can also use Cloudera Manager.
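If you want a single number comparable to Hive's 'Total MapReduce CPU Time
Spent', you can also aggregate Spark's per-task metrics with a SparkListener.
Below is a minimal sketch in Scala (the class name TotalTaskTimeListener is
just illustrative). Note that executorRunTime is wall-clock time per task,
not pure CPU time; a dedicated executorCpuTime metric only exists in newer
Spark releases:

import java.util.concurrent.atomic.AtomicLong
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Sums per-task executor run time across the whole application,
// roughly analogous to Hive's total CPU time line.
class TotalTaskTimeListener extends SparkListener {
  val totalRunTimeMs = new AtomicLong(0L)

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    val metrics = taskEnd.taskMetrics
    // taskMetrics can be null for failed tasks, so guard before reading.
    if (metrics != null) {
      // Elapsed time the task spent running on the executor, in ms.
      totalRunTimeMs.addAndGet(metrics.executorRunTime)
    }
  }
}

Register the listener before running the job, then read the total afterwards:

  val listener = new TotalTaskTimeListener
  sc.addSparkListener(listener)
  // ... run your job ...
  println(s"Total task time: ${listener.totalRunTimeMs.get} ms")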
Cheers
Gen
On Sat, Aug 8, 2015 at 6:06 AM, Xiao wrote:
Hi all,
I was running some Hive/Spark jobs on a Hadoop cluster. I want to see how
Spark helps improve not only the elapsed time but also the total CPU
consumption. For Hive, I can get the 'Total MapReduce CPU Time Spent' from
the log when the job finishes, but I couldn't find any CPU stats for Spark.