[ https://issues.apache.org/jira/browse/SPARK-35623?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17408750#comment-17408750 ]
Senthil Kumar commented on SPARK-35623:
---------------------------------------

[~dipanjanK] Please include me too.

> Volcano resource manager for Spark on Kubernetes
> ------------------------------------------------
>
>                 Key: SPARK-35623
>                 URL: https://issues.apache.org/jira/browse/SPARK-35623
>             Project: Spark
>          Issue Type: Brainstorming
>          Components: Kubernetes
>    Affects Versions: 3.1.1, 3.1.2
>            Reporter: Dipanjan Kailthya
>            Priority: Minor
>              Labels: kubernetes, resourcemanager
>
> Dear Spark Developers,
>
> Hello from the Netherlands! Posting this here as I still haven't been
> accepted to post on the Spark dev mailing list.
>
> My team is planning to use Spark with Kubernetes support on our shared
> (multi-tenant) on-premises Kubernetes cluster. However, we would like to
> have certain scheduling features, such as fair share and preemption, which
> as we understand are not yet built into the current spark-kubernetes
> resource manager. We have been working on, and are close to, a first
> successful prototype integration with Volcano
> ([https://volcano.sh/en/docs/]). Briefly, this means a new resource manager
> component with much in common with the existing spark-kubernetes resource
> manager, but instead of pods it launches Volcano jobs, which delegate
> driver and executor pod creation and lifecycle management to Volcano. We
> are interested in contributing this to open source, either directly in
> Spark or as a separate project.
>
> So, two questions:
>
> 1. Do the Spark maintainers see this as a valuable contribution to the
> mainline Spark codebase? If so, can we have some guidance on how to publish
> the changes?
>
> 2. Are any other developers / organizations interested in contributing to
> this effort? If so, please get in touch.
>
> Best,
> Dipanjan

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
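
For readers unfamiliar with Volcano, the integration the ticket describes (wrapping driver/executor pods in Volcano Jobs so that queue-based fair share, preemption, and gang scheduling apply) could look roughly like the manifest below. This is a hedged illustration only: the job name, image tag, queue, and command are assumptions for the sketch, not details taken from the prototype discussed in this issue.

```yaml
# Minimal sketch of a Volcano Job that could wrap a Spark driver pod.
# All names, the image, and the queue here are illustrative assumptions.
apiVersion: batch.volcano.sh/v1alpha1
kind: Job
metadata:
  name: spark-driver-job          # hypothetical name
spec:
  schedulerName: volcano
  queue: default                  # fair share and preemption are configured per queue
  minAvailable: 1                 # gang scheduling: pods required before the job starts
  tasks:
    - name: driver
      replicas: 1
      template:
        spec:
          restartPolicy: Never
          containers:
            - name: spark-driver
              image: apache/spark:3.1.2          # illustrative image
              command: ["/opt/spark/bin/spark-submit"]
```

Under this model, Volcano (rather than the Spark Kubernetes backend directly) owns pod creation and lifecycle for the job's tasks, which is what enables cluster-level scheduling policies across tenants.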