[jira] [Created] (SPARK-49052) Add SparkOperator main entry point class
Zhou JIANG created SPARK-49052:
--
Summary: Add SparkOperator main entry point class
Key: SPARK-49052
URL: https://issues.apache.org/jira/browse/SPARK-49052
Project: Spark
Issue Type: Sub-task
Components: k8s
Affects Versions: kubernetes-operator-0.1.0
Reporter: Zhou JIANG
--
This message was sent by Atlassian Jira (v8.20.10#820010)
-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-49045) Add docker image build for operator
Zhou JIANG created SPARK-49045:
--
Summary: Add docker image build for operator
Key: SPARK-49045
URL: https://issues.apache.org/jira/browse/SPARK-49045
Project: Spark
Issue Type: Sub-task
Components: k8s
Affects Versions: kubernetes-operator-0.1.0
Reporter: Zhou JIANG
[jira] [Created] (SPARK-48984) Add Controller Metrics System and Utils
Zhou JIANG created SPARK-48984:
--
Summary: Add Controller Metrics System and Utils
Key: SPARK-48984
URL: https://issues.apache.org/jira/browse/SPARK-48984
Project: Spark
Issue Type: Sub-task
Components: k8s
Affects Versions: kubernetes-operator-0.1.0
Reporter: Zhou JIANG
[jira] [Updated] (SPARK-48786) Define Config Loading Framework for Spark Operator Controller
[ https://issues.apache.org/jira/browse/SPARK-48786?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Zhou JIANG updated SPARK-48786:
---
Summary: Define Config Loading Framework for Spark Operator Controller (was: Define Conf Properties for Spark Operator Controller)

> Define Config Loading Framework for Spark Operator Controller
> -
>
> Key: SPARK-48786
> URL: https://issues.apache.org/jira/browse/SPARK-48786
> Project: Spark
> Issue Type: Sub-task
> Components: k8s
> Affects Versions: kubernetes-operator-0.1.0
> Reporter: Zhou JIANG
> Priority: Major
> Labels: pull-request-available
>
[jira] [Created] (SPARK-48786) Define Conf Properties for Spark Operator Controller
Zhou JIANG created SPARK-48786:
--
Summary: Define Conf Properties for Spark Operator Controller
Key: SPARK-48786
URL: https://issues.apache.org/jira/browse/SPARK-48786
Project: Spark
Issue Type: Sub-task
Components: k8s
Affects Versions: kubernetes-operator-0.1.0
Reporter: Zhou JIANG
[jira] [Created] (SPARK-48679) Upgrade checkstyle and spotbugs version
Zhou JIANG created SPARK-48679:
--
Summary: Upgrade checkstyle and spotbugs version
Key: SPARK-48679
URL: https://issues.apache.org/jira/browse/SPARK-48679
Project: Spark
Issue Type: Sub-task
Components: k8s
Affects Versions: kubernetes-operator-0.1.0
Reporter: Zhou JIANG

Upgrade checkstyle/spotbugs versions to the latest in the operator.
[jira] [Created] (SPARK-48400) Promote `PrometheusServlet` to `DeveloperApi`
Zhou JIANG created SPARK-48400:
--
Summary: Promote `PrometheusServlet` to `DeveloperApi`
Key: SPARK-48400
URL: https://issues.apache.org/jira/browse/SPARK-48400
Project: Spark
Issue Type: Sub-task
Components: Spark Core
Affects Versions: kubernetes-operator-0.1.0
Reporter: Zhou JIANG
[jira] [Created] (SPARK-48398) Add Helm chart for Operator Deployment
Zhou JIANG created SPARK-48398:
--
Summary: Add Helm chart for Operator Deployment
Key: SPARK-48398
URL: https://issues.apache.org/jira/browse/SPARK-48398
Project: Spark
Issue Type: Sub-task
Components: k8s
Affects Versions: kubernetes-operator-0.1.0
Reporter: Zhou JIANG
[jira] [Created] (SPARK-48382) Add controller / reconciler module to operator
Zhou JIANG created SPARK-48382:
--
Summary: Add controller / reconciler module to operator
Key: SPARK-48382
URL: https://issues.apache.org/jira/browse/SPARK-48382
Project: Spark
Issue Type: Sub-task
Components: k8s
Affects Versions: kubernetes-operator-0.1.0
Reporter: Zhou JIANG
[jira] [Created] (SPARK-48326) Upgrade submission worker base Spark version to 4.0.0-preview2
Zhou JIANG created SPARK-48326:
--
Summary: Upgrade submission worker base Spark version to 4.0.0-preview2
Key: SPARK-48326
URL: https://issues.apache.org/jira/browse/SPARK-48326
Project: Spark
Issue Type: Sub-task
Components: k8s
Affects Versions: kubernetes-operator-0.1.0
Reporter: Zhou JIANG
[jira] [Created] (SPARK-48121) Promote `KubernetesDriverConf` to `DeveloperApi`
Zhou JIANG created SPARK-48121:
--
Summary: Promote `KubernetesDriverConf` to `DeveloperApi`
Key: SPARK-48121
URL: https://issues.apache.org/jira/browse/SPARK-48121
Project: Spark
Issue Type: Sub-task
Components: k8s
Affects Versions: kubernetes-operator-0.1.0
Reporter: Zhou JIANG
[jira] [Created] (SPARK-48119) Promote `KubernetesDriverSpec` to `DeveloperApi`
Zhou JIANG created SPARK-48119:
--
Summary: Promote `KubernetesDriverSpec` to `DeveloperApi`
Key: SPARK-48119
URL: https://issues.apache.org/jira/browse/SPARK-48119
Project: Spark
Issue Type: Sub-task
Components: k8s
Affects Versions: kubernetes-operator-0.1.0
Reporter: Zhou JIANG
[jira] [Created] (SPARK-48103) Promote `KubernetesDriverBuilder` to `DeveloperApi`
Zhou JIANG created SPARK-48103:
--
Summary: Promote `KubernetesDriverBuilder` to `DeveloperApi`
Key: SPARK-48103
URL: https://issues.apache.org/jira/browse/SPARK-48103
Project: Spark
Issue Type: Sub-task
Components: k8s
Affects Versions: kubernetes-operator-0.1.0
Reporter: Zhou JIANG
[jira] [Created] (SPARK-48017) Add Spark application submission worker for operator
Zhou JIANG created SPARK-48017:
--
Summary: Add Spark application submission worker for operator
Key: SPARK-48017
URL: https://issues.apache.org/jira/browse/SPARK-48017
Project: Spark
Issue Type: Sub-task
Components: k8s
Affects Versions: kubernetes-operator-0.1.0
Reporter: Zhou JIANG

The Spark Operator needs a submission worker that converts its application abstraction (Operator API) to k8s resources.
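The conversion described above can be sketched in plain Java. This is an illustrative sketch only: `SparkAppSpec`, `SubmissionWorker`, and all field names are invented for this example and are not the operator's actual classes; the k8s manifest is modeled as a nested `Map` standing in for a real client object.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical application abstraction (a stand-in for the Operator API).
final class SparkAppSpec {
    final String name;
    final String image;
    final int executorInstances;

    SparkAppSpec(String name, String image, int executorInstances) {
        this.name = name;
        this.image = image;
        this.executorInstances = executorInstances;
    }
}

final class SubmissionWorker {
    // Translate the operator-level abstraction into a plain map that mirrors
    // the structure of a Kubernetes driver Pod manifest.
    static Map<String, Object> toDriverPod(SparkAppSpec spec) {
        Map<String, Object> metadata = new LinkedHashMap<>();
        metadata.put("name", spec.name + "-driver");
        metadata.put("labels", Map.of("spark-app-name", spec.name));

        Map<String, Object> container = new LinkedHashMap<>();
        container.put("name", "spark-driver");
        container.put("image", spec.image);
        container.put("args", List.of(
            "driver", "--conf", "spark.executor.instances=" + spec.executorInstances));

        Map<String, Object> pod = new LinkedHashMap<>();
        pod.put("apiVersion", "v1");
        pod.put("kind", "Pod");
        pod.put("metadata", metadata);
        pod.put("spec", Map.of("containers", List.of(container)));
        return pod;
    }
}
```

In a real submission worker the map-building step would be replaced by a typed Kubernetes client model, but the shape of the translation is the same.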
[jira] [Created] (SPARK-47950) Add Java API Module for Spark Operator
Zhou JIANG created SPARK-47950:
--
Summary: Add Java API Module for Spark Operator
Key: SPARK-47950
URL: https://issues.apache.org/jira/browse/SPARK-47950
Project: Spark
Issue Type: Sub-task
Components: k8s
Affects Versions: kubernetes-operator-0.1.0
Reporter: Zhou JIANG

The Spark Operator API refers to the [CustomResourceDefinition|https://kubernetes.io/docs/tasks/extend-kubernetes/custom-resources/custom-resource-definitions/] that represents the spec for a Spark application in k8s. This aims to add a Java API library for the Spark Operator, with the ability to generate YAML spec.
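A minimal sketch of "Java object in, YAML spec out", under stated assumptions: the API group/version `spark.apache.org/v1alpha1`, the class name `SparkApplicationSpec`, and the field names are all invented here, and the YAML is emitted by hand rather than via a serialization library.

```java
import java.util.Objects;

// Hypothetical CR model; a real API module would generate this from the CRD
// schema and serialize it with a YAML library instead of string-building.
final class SparkApplicationSpec {
    final String name;
    final String mainClass;
    final String image;

    SparkApplicationSpec(String name, String mainClass, String image) {
        this.name = Objects.requireNonNull(name);
        this.mainClass = Objects.requireNonNull(mainClass);
        this.image = Objects.requireNonNull(image);
    }

    // Render the object as a minimal YAML custom-resource manifest.
    String toYaml() {
        return String.join("\n",
            "apiVersion: spark.apache.org/v1alpha1",
            "kind: SparkApplication",
            "metadata:",
            "  name: " + name,
            "spec:",
            "  mainClass: " + mainClass,
            "  image: " + image);
    }
}
```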
[jira] [Created] (SPARK-47943) Add Operator CI Task for Java Build and Test
Zhou JIANG created SPARK-47943:
--
Summary: Add Operator CI Task for Java Build and Test
Key: SPARK-47943
URL: https://issues.apache.org/jira/browse/SPARK-47943
Project: Spark
Issue Type: Sub-task
Components: k8s
Affects Versions: kubernetes-operator-0.1.0
Reporter: Zhou JIANG

We need to add a CI task to build and test Java code for upcoming operator pull requests.
[jira] [Created] (SPARK-47889) Setup gradle as build tool for operator repository
Zhou JIANG created SPARK-47889:
--
Summary: Setup gradle as build tool for operator repository
Key: SPARK-47889
URL: https://issues.apache.org/jira/browse/SPARK-47889
Project: Spark
Issue Type: Sub-task
Components: Kubernetes
Affects Versions: kubernetes-operator-0.1.0
Reporter: Zhou JIANG
[jira] [Created] (SPARK-47745) Add License to Spark Operator
Zhou JIANG created SPARK-47745:
--
Summary: Add License to Spark Operator
Key: SPARK-47745
URL: https://issues.apache.org/jira/browse/SPARK-47745
Project: Spark
Issue Type: Sub-task
Components: Kubernetes
Affects Versions: 4.0.0
Reporter: Zhou JIANG

Add license to the recently established operator repository.
[jira] [Created] (SPARK-45923) SPIP: Spark Kubernetes Operator
Zhou Jiang created SPARK-45923:
--
Summary: SPIP: Spark Kubernetes Operator
Key: SPARK-45923
URL: https://issues.apache.org/jira/browse/SPARK-45923
Project: Spark
Issue Type: New Feature
Components: Kubernetes
Affects Versions: 4.0.0
Reporter: Zhou Jiang

We would like to develop a Java-based Kubernetes operator for Apache Spark. Following the operator pattern (https://kubernetes.io/docs/concepts/extend-kubernetes/operator/), Spark users may manage applications and related components seamlessly using native tools like kubectl. The primary goal is to simplify the Spark user experience on Kubernetes, minimizing the learning curve and operational complexities, and thereby enabling users to focus on Spark application development.

Ideally, it would reside in a separate repository (like Spark Docker or Spark Connect Go) and be loosely connected to the Spark release cycle while supporting multiple Spark versions.

SPIP doc: https://docs.google.com/document/d/1f5mm9VpSKeWC72Y9IiKN2jbBn32rHxjWKUfLRaGEcLE
Dev email discussion: https://lists.apache.org/thread/wdy7jfhf7m8jy74p6s0npjfd15ym5rxz
[jira] [Created] (SPARK-39100) RDD DeterministicLevel Needs Backward Compatibility
Zhou JIANG created SPARK-39100:
--
Summary: RDD DeterministicLevel Needs Backward Compatibility
Key: SPARK-39100
URL: https://issues.apache.org/jira/browse/SPARK-39100
Project: Spark
Issue Type: Bug
Components: Web UI
Affects Versions: 3.2.1, 2.4.8
Reporter: Zhou JIANG

SPARK-34592 introduces RDD DeterministicLevel, which is used in the Spark UI stage DAG visualization. This is not fully backward compatible for the History Server. In production, the History Server (regardless of its own version) can be responsible for deserializing job / RDD data written by multiple Spark versions. Historical RDD data from previous Spark versions does not have the new field, and UI rendering for the stages / executors tabs can crash with:

throwable: {
  class: scala.MatchError
  msg: null
  stack: [
    org.apache.spark.ui.scope.RDDOperationGraph$.org$apache$spark$ui$scope$RDDOperationGraph$$makeDotNode(RDDOperationGraph.scala:242)
    org.apache.spark.ui.scope.RDDOperationGraph$$anonfun$org$apache$spark$ui$scope$RDDOperationGraph$$makeDotSubgraph$1.apply(RDDOperationGraph.scala:260)
    org.apache.spark.ui.scope.RDDOperationGraph$$anonfun$org$apache$spark$ui$scope$RDDOperationGraph$$makeDotSubgraph$1.apply(RDDOperationGraph.scala:259)
    scala.collection.immutable.Stream.foreach(Stream.scala:594)
    org.apache.spark.ui.scope.RDDOperationGraph$.org$apache$spark$ui$scope$RDDOperationGraph$$makeDotSubgraph(RDDOperationGraph.scala:259)
    org.apache.spark.ui.scope.RDDOperationGraph$$anonfun$org$apache$spark$ui$scope$RDDOperationGraph$$makeDotSubgraph$2.apply(RDDOperationGraph.scala:263)
    org.apache.spark.ui.scope.RDDOperationGraph$$anonfun$org$apache$spark$ui$scope$RDDOperationGraph$$makeDotSubgraph$2.apply(RDDOperationGraph.scala:262)
    scala.collection.immutable.Stream.foreach(Stream.scala:594)
    org.apache.spark.ui.scope.RDDOperationGraph$.org$apache$spark$ui$scope$RDDOperationGraph$$makeDotSubgraph(RDDOperationGraph.scala:262)
    org.apache.spark.ui.scope.RDDOperationGraph$.makeDotFile(RDDOperationGraph.scala:227)
    org.apache.spark.ui.UIUtils$$anonfun$showDagViz$1.apply(UIUtils.scala:433)
    org.apache.spark.ui.UIUtils$$anonfun$showDagViz$1.apply(UIUtils.scala:429)
    scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    scala.collection.immutable.List.foreach(List.scala:392)
    scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
    scala.collection.immutable.List.map(List.scala:296)
    org.apache.spark.ui.UIUtils$.showDagViz(UIUtils.scala:429)
    org.apache.spark.ui.UIUtils$.showDagVizForStage(UIUtils.scala:401)
    org.apache.spark.ui.jobs.StagePage.render(StagePage.scala:257)
    org.apache.spark.ui.WebUI$$anonfun$2.apply(WebUI.scala:90)
    org.apache.spark.ui.WebUI$$anonfun$2.apply(WebUI.scala:90)
    org.apache.spark.ui.JettyUtils$$anon$3.doGet(JettyUtils.scala:90)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    org.spark_project.jetty.servlet.ServletHolder.handle(ServletHolder.java:791)
    org.spark_project.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1626)
    pie.spark.ui.filter.PIEAuthFilter.doFilter(PIEAuthFilter.scala:89)
    org.spark_project.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    org.spark_project.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
    org.apache.spark.deploy.history.ApplicationCacheCheckFilter.doFilter(ApplicationCache.scala:405)
    org.spark_project.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
    org.spark_project.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
    org.spark_project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:548)
    org.spark_project.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
    org.spark_project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1435)
    org.spark_project.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
    org.spark_project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:501)
    org.spark_project.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
    org.spark_project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1350)
    org.spark_project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
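The backward-compatible handling the issue asks for can be sketched as follows. This is not Spark's actual fix: the `parseCompat` helper is hypothetical, and falling back to `DETERMINATE` for missing or unrecognized values is one possible choice (pre-SPARK-34592 RDDs were effectively treated as determinate); the point is that the reader must tolerate the field being absent instead of failing with a MatchError.

```java
// Mirrors Spark's DeterministicLevel values for illustration.
enum DeterministicLevel { DETERMINATE, UNORDERED, INDETERMINATE }

final class DeterministicLevelCompat {
    // Hypothetical helper: parse the level out of historical event data,
    // falling back to DETERMINATE when the field is absent (old Spark
    // versions) or holds a value this version does not recognize (newer
    // Spark versions), instead of throwing.
    static DeterministicLevel parseCompat(String raw) {
        if (raw == null) {
            return DeterministicLevel.DETERMINATE;
        }
        try {
            return DeterministicLevel.valueOf(raw);
        } catch (IllegalArgumentException unknownValue) {
            return DeterministicLevel.DETERMINATE;
        }
    }
}
```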
[jira] [Created] (SPARK-34316) Optional Propagation of SPARK_CONF_DIR in K8s
Zhou JIANG created SPARK-34316:
--
Summary: Optional Propagation of SPARK_CONF_DIR in K8s
Key: SPARK-34316
URL: https://issues.apache.org/jira/browse/SPARK-34316
Project: Spark
Issue Type: New Feature
Components: Kubernetes
Affects Versions: 3.0.1
Reporter: Zhou JIANG

In shared Kubernetes clusters, Spark could be restricted from creating and deleting config maps in job namespaces. It would be helpful if the currently mandatory config map creation could be made optional. Users may still take responsibility for handling Spark conf files separately.
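The requested behavior amounts to gating one submission step behind a flag. A minimal sketch, with loud assumptions: the property name `spark.kubernetes.driver.confDir.propagate`, the class `ConfPropagationStep`, and the ConfigMap naming are all invented for this example and do not reflect Spark's actual configuration surface.

```java
import java.util.Map;
import java.util.Optional;

final class ConfPropagationStep {
    // Hypothetical flag; defaults to the current mandatory behavior.
    static final String PROPAGATE_KEY = "spark.kubernetes.driver.confDir.propagate";

    // Return the ConfigMap to create for SPARK_CONF_DIR propagation, or
    // empty when the user has opted out (e.g. because the namespace forbids
    // ConfigMap creation and conf files are mounted by other means).
    static Optional<String> configMapName(Map<String, String> sparkConf, String resourceNamePrefix) {
        boolean propagate = Boolean.parseBoolean(sparkConf.getOrDefault(PROPAGATE_KEY, "true"));
        if (!propagate) {
            return Optional.empty();
        }
        return Optional.of(resourceNamePrefix + "-conf-map");
    }
}
```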