The app ID is assigned internally by Spark's task scheduler
(https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/TaskScheduler.scala#L35).
You could probably change the naming, but I'm fairly sure the ID will
always have to be unique for a context on a cluster.
Alternatively, could setting the application name (via conf.setAppName
or the "spark.app.name" config key) help with what you're trying to achieve?
