I observe that YARN job history logs (*.jhist files) are created in
/user/history/done for all MapReduce jobs, including those launched by Hive,
Pig, etc. But for Spark jobs submitted in yarn-cluster mode, no such logs are
created.

I would like to see resource utilization (CPU, memory, etc.) by Spark jobs.
Is there another place where I can find this information, or is there some
configuration I can set so that job history logs are created for Spark jobs
just as they are for MapReduce jobs?
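For reference, the closest mechanism I am aware of is Spark's own event
logging, which the Spark history server reads (this is a sketch of the
standard settings; the HDFS path below is just an example, not my actual
setup):

```properties
# spark-defaults.conf -- enable Spark event logging so the Spark history
# server can show per-application details (tasks, executors, storage).
# Note: these are Spark-specific event logs, not MapReduce *.jhist files.
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs:///user/spark/applicationHistory

# History-server side: directory it reads completed application logs from.
spark.history.fs.logDirectory    hdfs:///user/spark/applicationHistory
```

Even with this enabled, I am not sure whether the event logs expose the same
CPU/memory utilization view that the YARN job history gives for MapReduce,
which is why I am asking.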



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Job-History-Logs-for-spark-jobs-submitted-on-YARN-tp25946.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

