I'm interested to see if anyone knows of a way to set custom job/stage names
for a Spark application.
I believe I can use sparkContext.setCallSite(String) to update job/stage
names, but it does not let me name each stage individually: setting this
value applies the same text to the job and stage names of everything
submitted afterwards.
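
A minimal sketch of what I mean, pasted into spark-shell (the names are just
placeholders): since setCallSite applies to everything submitted after it,
the only per-job naming I can get is by calling it again before each action.

    // paste into spark-shell; sc is the shell's SparkContext
    sc.setCallSite("load and count input")
    sc.parallelize(1 to 1000).count()

    // call setCallSite again so the next job gets a different name
    sc.setCallSite("sum the input")
    sc.parallelize(1 to 1000).reduce(_ + _)

    sc.clearCallSite()  // revert to the default auto-generated call sites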
--
I'm currently using Spark 1.3.0 on a YARN cluster deployed through CDH 5.4.
My cluster does not have a 'default' queue, so launching 'spark-shell'
submits a YARN application that gets killed immediately because the queue
does not exist. However, the spark-shell session is still in progress after
the YARN application has been killed.
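
As a possible workaround (assuming the cluster has some queue you can use;
'my_queue' below is a placeholder), spark-shell accepts the standard --queue
option, or the equivalent spark.yarn.queue setting, so the shell's YARN
application lands in an existing queue instead of the missing 'default':

    # submit the shell's YARN application to an existing queue
    spark-shell --master yarn-client --queue my_queue

    # equivalently, via configuration
    spark-shell --master yarn-client --conf spark.yarn.queue=my_queue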