[ https://issues.apache.org/jira/browse/SPARK-49830?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun resolved SPARK-49830.
-----------------------------------
    Fix Version/s: kubernetes-operator-0.1.0
       Resolution: Fixed

Issue resolved by pull request 139
[https://github.com/apache/spark-kubernetes-operator/pull/139]

> Fix the error when enabling the sparkApplicationSentinel
> ---------------------------------------------------------
>
>                 Key: SPARK-49830
>                 URL: https://issues.apache.org/jira/browse/SPARK-49830
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Kubernetes
>    Affects Versions: kubernetes-operator-0.1.0
>            Reporter: Qi Tan
>            Assignee: Qi Tan
>            Priority: Minor
>             Fix For: kubernetes-operator-0.1.0
>
> When the sparkApplicationSentinel is enabled, the operator logs show the following ERROR:
>
> Caused by: io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: PUT at:
> https://10.96.0.1:443/apis/spark.apache.org/v1alpha1/namespaces/spark-4/sparkapplications/spark-app-sentinel.
> Message: SparkApplication.spark.apache.org "spark-app-sentinel" is invalid: spec.runtimeVersions: Required value.
> Received status: Status(apiVersion=v1, code=422,
> details=StatusDetails(causes=[StatusCause(field=spec.runtimeVersions, message=Required value, reason=FieldValueRequired, additionalProperties={})],
> group=spark.apache.org, kind=SparkApplication, name=spark-app-sentinel, retryAfterSeconds=null, uid=null, additionalProperties={}),
> kind=Status,
> message=SparkApplication.spark.apache.org "spark-app-sentinel" is invalid: spec.runtimeVersions: Required value,
> metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}),
> reason=Invalid, status=Failure, additionalProperties={}).
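Editor's note: the 422 above is CRD schema validation rejecting the sentinel SparkApplication because the write omits the required spec.runtimeVersions block. Below is a minimal sketch, not the operator's actual sentinel code, of writing a sentinel-style SparkApplication with that block populated via the fabric8 kubernetes-client (assumes client 6.x; the "sparkVersion" key and its value under runtimeVersions are illustrative assumptions, only the requirement that spec.runtimeVersions be present comes from the error):

    // Sketch: create a SparkApplication CR carrying spec.runtimeVersions so the
    // API server's CRD validation (spec.runtimeVersions: Required value) passes.
    import java.util.Map;

    import io.fabric8.kubernetes.api.model.GenericKubernetesResource;
    import io.fabric8.kubernetes.api.model.GenericKubernetesResourceBuilder;
    import io.fabric8.kubernetes.client.KubernetesClient;
    import io.fabric8.kubernetes.client.KubernetesClientBuilder;
    import io.fabric8.kubernetes.client.dsl.base.ResourceDefinitionContext;

    public class SentinelSketch {
      public static void main(String[] args) {
        // Resource coordinates taken from the failing PUT URL in the report.
        ResourceDefinitionContext ctx = new ResourceDefinitionContext.Builder()
            .withGroup("spark.apache.org")
            .withVersion("v1alpha1")
            .withKind("SparkApplication")
            .withPlural("sparkapplications")
            .withNamespaced(true)
            .build();

        GenericKubernetesResource sentinel = new GenericKubernetesResourceBuilder()
            .withApiVersion("spark.apache.org/v1alpha1")
            .withKind("SparkApplication")
            .withNewMetadata()
                .withName("spark-app-sentinel")
                .withNamespace("spark-4")
            .endMetadata()
            // Without a runtimeVersions entry the API server rejects the request
            // with 422 FieldValueRequired, as shown in the operator log above.
            .addToAdditionalProperties("spec",
                Map.of("runtimeVersions",
                    Map.of("sparkVersion", "4.0.0"))) // illustrative value only
            .build();

        try (KubernetesClient client = new KubernetesClientBuilder().build()) {
          client.genericKubernetesResources(ctx)
              .inNamespace("spark-4")
              .resource(sentinel)
              .create();
        }
      }
    }

The authoritative change is the one merged in pull request 139 linked above; the sketch only illustrates the validation constraint the fix has to satisfy.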