[ https://issues.apache.org/jira/browse/SPARK-24793?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16541923#comment-16541923 ]
Erik Erlandson commented on SPARK-24793:
----------------------------------------

I am concerned that this is outside the scope of {{spark-submit}}, especially since it is arguably a k8s-centric use case. But it's definitely a useful set of functionality. I'd propose strategic use of labels to make these kinds of operations easier via {{kubectl}}, possibly supported by a tutorial example in the docs: "here's how to use labels to do common operations like 'kill this app' and 'list all the running driver pods'", etc.

> Make spark-submit more useful with k8s
> --------------------------------------
>
>                 Key: SPARK-24793
>                 URL: https://issues.apache.org/jira/browse/SPARK-24793
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes
>    Affects Versions: 2.3.0
>            Reporter: Anirudh Ramanathan
>            Assignee: Anirudh Ramanathan
>            Priority: Major
>
> Support controlling the lifecycle of a Spark application through spark-submit. For example:
> {{
> --kill app_name      If given, kills the driver specified.
> --status app_name    If given, requests the status of the driver specified.
> }}
> Potentially also a --list flag to list all running Spark drivers.
> Given that our submission client can launch jobs into many different namespaces, we'll also need a way to specify the namespace, potentially through a --namespace flag.
> I think this is pretty useful to have instead of forcing a user to use kubectl to manage the lifecycle of any k8s Spark application.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
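The label-based workflow suggested in the comment might look something like the following sketch. The label names are assumptions for illustration: `spark-role=driver` reflects a label the Spark-on-k8s backend applies to driver pods, and `app-name` stands in for a hypothetical user-supplied label (e.g. set via a `spark.kubernetes.driver.label.*` configuration); verify both against your Spark version before relying on them.

```shell
# List all running Spark driver pods in a namespace, selecting on the
# driver-role label (assumed label name; check your deployment):
kubectl get pods -n my-namespace -l spark-role=driver \
    --field-selector=status.phase=Running

# "Kill this app": delete the driver pod for one application, selected
# by a hypothetical per-app label attached at submission time:
kubectl delete pod -n my-namespace -l app-name=my-spark-app

# "Status of this app": show details of the same driver pod:
kubectl describe pod -n my-namespace -l app-name=my-spark-app
```

These commands require a live cluster and are shown only to illustrate how labels could substitute for dedicated spark-submit flags like `--kill` and `--status`.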