[jira] [Resolved] (SPARK-48606) Upgrade `google-java-format` to 1.22.0
[ https://issues.apache.org/jira/browse/SPARK-48606?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-48606.
-----------------------------------
    Fix Version/s: kubernetes-operator-0.1.0
       Resolution: Fixed

Issue resolved by pull request 15
[https://github.com/apache/spark-kubernetes-operator/pull/15]

> Upgrade `google-java-format` to 1.22.0
> --------------------------------------
>
>                 Key: SPARK-48606
>                 URL: https://issues.apache.org/jira/browse/SPARK-48606
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes
>    Affects Versions: kubernetes-operator-0.1.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: kubernetes-operator-0.1.0
>
>
> This issue aims to upgrade the `google-java-format` plugin of the Spark
> Kubernetes Operator repository to bring in the latest bug fixes, like the
> following. The latest version is recommended.
> {code}
> java.lang.Exception: google-java-format 1.17.0 is currently being used, but outdated.
> google-java-format 1.19.2 is the recommended version, which may have fixed this problem.
> google-java-format 1.19.2 requires JVM 11+.
> {code}

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
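For context, a quick way to verify sources against a given google-java-format release is its standalone CLI. This is a sketch, not the operator repository's actual build wiring; the download URL follows the project's release naming pattern and the source paths are illustrative:

```shell
# Fetch the all-deps jar for the desired release
# (URL follows google-java-format's release naming; verify before use).
curl -LO https://github.com/google/google-java-format/releases/download/v1.22.0/google-java-format-1.22.0-all-deps.jar

# Check formatting without modifying files; exits non-zero on violations.
java -jar google-java-format-1.22.0-all-deps.jar \
  --dry-run --set-exit-if-changed $(find src -name '*.java')

# Or rewrite files in place.
java -jar google-java-format-1.22.0-all-deps.jar \
  --replace $(find src -name '*.java')
```

The `--dry-run`/`--set-exit-if-changed` combination is what a CI check typically uses, while `--replace` is the local fix-up mode.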
[jira] [Assigned] (SPARK-48606) Upgrade `google-java-format` to 1.22.0
[ https://issues.apache.org/jira/browse/SPARK-48606?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun reassigned SPARK-48606:
-------------------------------------
    Assignee: Dongjoon Hyun

> Upgrade `google-java-format` to 1.22.0
> --------------------------------------
>
>                 Key: SPARK-48606
>                 URL: https://issues.apache.org/jira/browse/SPARK-48606
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes
>    Affects Versions: kubernetes-operator-0.1.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Major
>              Labels: pull-request-available
>
>
> This issue aims to upgrade the `google-java-format` plugin of the Spark
> Kubernetes Operator repository to bring in the latest bug fixes, like the
> following. The latest version is recommended.
> {code}
> java.lang.Exception: google-java-format 1.17.0 is currently being used, but outdated.
> google-java-format 1.19.2 is the recommended version, which may have fixed this problem.
> google-java-format 1.19.2 requires JVM 11+.
> {code}
[jira] [Created] (SPARK-48606) Upgrade `google-java-format` to 1.22.0
Dongjoon Hyun created SPARK-48606:
----------------------------------

             Summary: Upgrade `google-java-format` to 1.22.0
                 Key: SPARK-48606
                 URL: https://issues.apache.org/jira/browse/SPARK-48606
             Project: Spark
          Issue Type: Improvement
          Components: Kubernetes
    Affects Versions: kubernetes-operator-0.1.0
            Reporter: Dongjoon Hyun

This issue aims to upgrade the `google-java-format` plugin of the Spark
Kubernetes Operator repository to bring in the latest bug fixes, like the
following. The latest version is recommended.

{code}
java.lang.Exception: google-java-format 1.17.0 is currently being used, but outdated.
google-java-format 1.19.2 is the recommended version, which may have fixed this problem.
google-java-format 1.19.2 requires JVM 11+.
{code}
[jira] [Updated] (SPARK-48554) Use R 4.4.0 in `windows` R GitHub Action Window job
[ https://issues.apache.org/jira/browse/SPARK-48554?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-48554:
----------------------------------
    Summary: Use R 4.4.0 in `windows` R GitHub Action Window job  (was: Use R 4.4.0 in `windows` R GitHub Action job)

> Use R 4.4.0 in `windows` R GitHub Action Window job
> ---------------------------------------------------
>
>                 Key: SPARK-48554
>                 URL: https://issues.apache.org/jira/browse/SPARK-48554
>             Project: Spark
>          Issue Type: Improvement
>          Components: Project Infra
>    Affects Versions: 4.0.0
>            Reporter: BingKun Pan
>            Assignee: BingKun Pan
>            Priority: Minor
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
[jira] [Updated] (SPARK-48554) Use R 4.4.0 in `windows` R GitHub Action Windows job
[ https://issues.apache.org/jira/browse/SPARK-48554?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-48554:
----------------------------------
    Summary: Use R 4.4.0 in `windows` R GitHub Action Windows job  (was: Use R 4.4.0 in `windows` R GitHub Action Window job)

> Use R 4.4.0 in `windows` R GitHub Action Windows job
> ----------------------------------------------------
>
>                 Key: SPARK-48554
>                 URL: https://issues.apache.org/jira/browse/SPARK-48554
>             Project: Spark
>          Issue Type: Improvement
>          Components: Project Infra
>    Affects Versions: 4.0.0
>            Reporter: BingKun Pan
>            Assignee: BingKun Pan
>            Priority: Minor
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
[jira] [Resolved] (SPARK-48528) Refine K8s Operator `merge_spark_pr.py` to use `kubernetes-operator-x.y.z` versions
[ https://issues.apache.org/jira/browse/SPARK-48528?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-48528.
-----------------------------------
    Fix Version/s: kubernetes-operator-0.1.0
         Assignee: Dongjoon Hyun
       Resolution: Fixed

This is resolved via https://github.com/apache/spark-kubernetes-operator/pull/14

> Refine K8s Operator `merge_spark_pr.py` to use `kubernetes-operator-x.y.z`
> versions
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-48528
>                 URL: https://issues.apache.org/jira/browse/SPARK-48528
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes
>    Affects Versions: kubernetes-operator-0.1.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Minor
>              Labels: pull-request-available
>             Fix For: kubernetes-operator-0.1.0
>
[jira] [Resolved] (SPARK-48531) Fix `Black` target version to Python 3.9
[ https://issues.apache.org/jira/browse/SPARK-48531?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-48531.
-----------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 46867
[https://github.com/apache/spark/pull/46867]

> Fix `Black` target version to Python 3.9
> ----------------------------------------
>
>                 Key: SPARK-48531
>                 URL: https://issues.apache.org/jira/browse/SPARK-48531
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Project Infra
>    Affects Versions: 4.0.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Minor
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
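For reference, Black's target version can be pinned either on the command line or in `pyproject.toml`. This is an illustrative sketch of the mechanism, not Spark's actual configuration; the source path is hypothetical:

```shell
# Pin the lowest supported Python syntax target on the command line
# (the path python/pyspark is illustrative).
black --target-version py39 python/pyspark

# The equivalent pyproject.toml setting:
#   [tool.black]
#   target-version = ["py39"]
```

Pinning `target-version` keeps Black from emitting syntax that older supported interpreters cannot parse, instead of inferring the target from the environment it runs in.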
[jira] [Assigned] (SPARK-48531) Fix `Black` target version to Python 3.9
[ https://issues.apache.org/jira/browse/SPARK-48531?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun reassigned SPARK-48531:
-------------------------------------
    Assignee: Dongjoon Hyun

> Fix `Black` target version to Python 3.9
> ----------------------------------------
>
>                 Key: SPARK-48531
>                 URL: https://issues.apache.org/jira/browse/SPARK-48531
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Project Infra
>    Affects Versions: 4.0.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Minor
>              Labels: pull-request-available
>
[jira] [Updated] (SPARK-48531) Fix `Black` target version to Python 3.9
[ https://issues.apache.org/jira/browse/SPARK-48531?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-48531:
----------------------------------
        Parent: SPARK-44111
    Issue Type: Sub-task  (was: Improvement)

> Fix `Black` target version to Python 3.9
> ----------------------------------------
>
>                 Key: SPARK-48531
>                 URL: https://issues.apache.org/jira/browse/SPARK-48531
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Project Infra
>    Affects Versions: 4.0.0
>            Reporter: Dongjoon Hyun
>            Priority: Minor
>
[jira] [Created] (SPARK-48531) Fix `Black` target version to Python 3.9
Dongjoon Hyun created SPARK-48531:
----------------------------------

             Summary: Fix `Black` target version to Python 3.9
                 Key: SPARK-48531
                 URL: https://issues.apache.org/jira/browse/SPARK-48531
             Project: Spark
          Issue Type: Improvement
          Components: Project Infra
    Affects Versions: 4.0.0
            Reporter: Dongjoon Hyun
[jira] [Updated] (SPARK-48528) Refine K8s Operator `merge_spark_pr.py` to use `kubernetes-operator-x.y.z` versions
[ https://issues.apache.org/jira/browse/SPARK-48528?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-48528:
----------------------------------
    Summary: Refine K8s Operator `merge_spark_pr.py` to use `kubernetes-operator-x.y.z` versions  (was: Refine K8s Operator `merge_spark_pr.py` to use `kubernetes-operator-x.y.z` version only)

> Refine K8s Operator `merge_spark_pr.py` to use `kubernetes-operator-x.y.z`
> versions
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-48528
>                 URL: https://issues.apache.org/jira/browse/SPARK-48528
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes
>    Affects Versions: kubernetes-operator-0.1.0
>            Reporter: Dongjoon Hyun
>            Priority: Major
>              Labels: pull-request-available
>
[jira] [Assigned] (SPARK-48382) Add controller / reconciler module to operator
[ https://issues.apache.org/jira/browse/SPARK-48382?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun reassigned SPARK-48382:
-------------------------------------
    Assignee: Zhou JIANG

> Add controller / reconciler module to operator
> ----------------------------------------------
>
>                 Key: SPARK-48382
>                 URL: https://issues.apache.org/jira/browse/SPARK-48382
>             Project: Spark
>          Issue Type: Sub-task
>          Components: k8s
>    Affects Versions: kubernetes-operator-0.1.0
>            Reporter: Zhou JIANG
>            Assignee: Zhou JIANG
>            Priority: Major
>              Labels: pull-request-available
>
[jira] [Created] (SPARK-48528) Refine K8s Operator `merge_spark_pr.py` to use `kubernetes-operator-x.y.z` version only
Dongjoon Hyun created SPARK-48528:
----------------------------------

             Summary: Refine K8s Operator `merge_spark_pr.py` to use `kubernetes-operator-x.y.z` version only
                 Key: SPARK-48528
                 URL: https://issues.apache.org/jira/browse/SPARK-48528
             Project: Spark
          Issue Type: Improvement
          Components: Kubernetes
    Affects Versions: kubernetes-operator-0.1.0
            Reporter: Dongjoon Hyun
[jira] [Updated] (SPARK-48326) Use the official Apache Spark 4.0.0-preview1
[ https://issues.apache.org/jira/browse/SPARK-48326?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-48326:
----------------------------------
    Fix Version/s: kubernetes-operator-0.1.0
                   (was: 4.0.0)

> Use the official Apache Spark 4.0.0-preview1
> --------------------------------------------
>
>                 Key: SPARK-48326
>                 URL: https://issues.apache.org/jira/browse/SPARK-48326
>             Project: Spark
>          Issue Type: Sub-task
>          Components: k8s
>    Affects Versions: kubernetes-operator-0.1.0
>            Reporter: Zhou JIANG
>            Assignee: Dongjoon Hyun
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: kubernetes-operator-0.1.0
>
[jira] [Updated] (SPARK-48326) Use the official Apache Spark 4.0.0-preview1
[ https://issues.apache.org/jira/browse/SPARK-48326?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-48326:
----------------------------------
    Summary: Use the official Apache Spark 4.0.0-preview1  (was: Upgrade submission worker base Spark version to 4.0.0-preview2)

> Use the official Apache Spark 4.0.0-preview1
> --------------------------------------------
>
>                 Key: SPARK-48326
>                 URL: https://issues.apache.org/jira/browse/SPARK-48326
>             Project: Spark
>          Issue Type: Sub-task
>          Components: k8s
>    Affects Versions: kubernetes-operator-0.1.0
>            Reporter: Zhou JIANG
>            Assignee: Dongjoon Hyun
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
[jira] [Assigned] (SPARK-48326) Upgrade submission worker base Spark version to 4.0.0-preview2
[ https://issues.apache.org/jira/browse/SPARK-48326?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun reassigned SPARK-48326:
-------------------------------------
    Assignee: Dongjoon Hyun

> Upgrade submission worker base Spark version to 4.0.0-preview2
> --------------------------------------------------------------
>
>                 Key: SPARK-48326
>                 URL: https://issues.apache.org/jira/browse/SPARK-48326
>             Project: Spark
>          Issue Type: Sub-task
>          Components: k8s
>    Affects Versions: kubernetes-operator-0.1.0
>            Reporter: Zhou JIANG
>            Assignee: Dongjoon Hyun
>            Priority: Major
>              Labels: pull-request-available
>
[jira] [Resolved] (SPARK-48326) Upgrade submission worker base Spark version to 4.0.0-preview2
[ https://issues.apache.org/jira/browse/SPARK-48326?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-48326.
-----------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 13
[https://github.com/apache/spark-kubernetes-operator/pull/13]

> Upgrade submission worker base Spark version to 4.0.0-preview2
> --------------------------------------------------------------
>
>                 Key: SPARK-48326
>                 URL: https://issues.apache.org/jira/browse/SPARK-48326
>             Project: Spark
>          Issue Type: Sub-task
>          Components: k8s
>    Affects Versions: kubernetes-operator-0.1.0
>            Reporter: Zhou JIANG
>            Assignee: Dongjoon Hyun
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
[jira] [Resolved] (SPARK-48394) Cleanup mapIdToMapIndex on mapoutput unregister
[ https://issues.apache.org/jira/browse/SPARK-48394?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-48394.
-----------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 46706
[https://github.com/apache/spark/pull/46706]

> Cleanup mapIdToMapIndex on mapoutput unregister
> -----------------------------------------------
>
>                 Key: SPARK-48394
>                 URL: https://issues.apache.org/jira/browse/SPARK-48394
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.5.0, 4.0.0, 3.5.1
>            Reporter: wuyi
>            Assignee: wuyi
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>
> There is only one valid mapstatus for the same {{mapIndex}} at the same time
> in Spark. {{mapIdToMapIndex}} should also follow the same rule to avoid
> chaos.
[jira] [Assigned] (SPARK-48394) Cleanup mapIdToMapIndex on mapoutput unregister
[ https://issues.apache.org/jira/browse/SPARK-48394?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun reassigned SPARK-48394:
-------------------------------------
    Assignee: wuyi

> Cleanup mapIdToMapIndex on mapoutput unregister
> -----------------------------------------------
>
>                 Key: SPARK-48394
>                 URL: https://issues.apache.org/jira/browse/SPARK-48394
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.5.0, 4.0.0, 3.5.1
>            Reporter: wuyi
>            Assignee: wuyi
>            Priority: Major
>              Labels: pull-request-available
>
>
> There is only one valid mapstatus for the same {{mapIndex}} at the same time
> in Spark. {{mapIdToMapIndex}} should also follow the same rule to avoid
> chaos.
[jira] [Resolved] (SPARK-48407) Teradata: Document Type Conversion rules between Spark SQL and teradata
[ https://issues.apache.org/jira/browse/SPARK-48407?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-48407.
-----------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 46728
[https://github.com/apache/spark/pull/46728]

> Teradata: Document Type Conversion rules between Spark SQL and teradata
> -----------------------------------------------------------------------
>
>                 Key: SPARK-48407
>                 URL: https://issues.apache.org/jira/browse/SPARK-48407
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Documentation, SQL
>    Affects Versions: 4.0.0
>            Reporter: Kent Yao
>            Assignee: Kent Yao
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
[jira] [Resolved] (SPARK-48325) Always specify messages in ExecutorRunner.killProcess
[ https://issues.apache.org/jira/browse/SPARK-48325?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-48325.
-----------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 46641
[https://github.com/apache/spark/pull/46641]

> Always specify messages in ExecutorRunner.killProcess
> -----------------------------------------------------
>
>                 Key: SPARK-48325
>                 URL: https://issues.apache.org/jira/browse/SPARK-48325
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 4.0.0
>            Reporter: Bo Zhang
>            Assignee: Bo Zhang
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>
> For some of the cases in ExecutorRunner.killProcess, the argument `message`
> is `None`. We should always specify the message so that we can get the
> occurrence rate of the different cases, in order to analyze executor
> stability.
[jira] [Resolved] (SPARK-48381) Update `YuniKorn` docs with v1.5.1
[ https://issues.apache.org/jira/browse/SPARK-48381?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-48381.
-----------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 46690
[https://github.com/apache/spark/pull/46690]

> Update `YuniKorn` docs with v1.5.1
> ----------------------------------
>
>                 Key: SPARK-48381
>                 URL: https://issues.apache.org/jira/browse/SPARK-48381
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Documentation, Kubernetes
>    Affects Versions: 4.0.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Minor
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
[jira] [Assigned] (SPARK-48381) Update `YuniKorn` docs with v1.5.1
[ https://issues.apache.org/jira/browse/SPARK-48381?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun reassigned SPARK-48381:
-------------------------------------
    Assignee: Dongjoon Hyun

> Update `YuniKorn` docs with v1.5.1
> ----------------------------------
>
>                 Key: SPARK-48381
>                 URL: https://issues.apache.org/jira/browse/SPARK-48381
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Documentation, Kubernetes
>    Affects Versions: 4.0.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Minor
>              Labels: pull-request-available
>
[jira] [Created] (SPARK-48381) Update `YuniKorn` docs with v1.5.1
Dongjoon Hyun created SPARK-48381:
----------------------------------

             Summary: Update `YuniKorn` docs with v1.5.1
                 Key: SPARK-48381
                 URL: https://issues.apache.org/jira/browse/SPARK-48381
             Project: Spark
          Issue Type: Sub-task
          Components: Documentation, Kubernetes
    Affects Versions: 4.0.0
            Reporter: Dongjoon Hyun
[jira] [Updated] (SPARK-48329) Enable `spark.sql.sources.v2.bucketing.pushPartValues.enabled` by default
[ https://issues.apache.org/jira/browse/SPARK-48329?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-48329:
----------------------------------
    Summary: Enable `spark.sql.sources.v2.bucketing.pushPartValues.enabled` by default  (was: Default spark.sql.sources.v2.bucketing.pushPartValues.enabled to true)

> Enable `spark.sql.sources.v2.bucketing.pushPartValues.enabled` by default
> -------------------------------------------------------------------------
>
>                 Key: SPARK-48329
>                 URL: https://issues.apache.org/jira/browse/SPARK-48329
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 4.0.0
>            Reporter: Szehon Ho
>            Assignee: Szehon Ho
>            Priority: Minor
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>
> The SPJ feature flag 'spark.sql.sources.v2.bucketing.pushPartValues.enabled'
> has proven valuable for most use cases. We should take advantage of the 4.0
> release and change the default value to true.
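Before the default changes, the flag has to be set explicitly per job. A minimal sketch of doing so at submit time (the application file name is illustrative; `spark.sql.sources.v2.bucketing.enabled` is the related storage-partitioned-join switch this flag builds on):

```shell
# Opt in to SPJ partition-value pushdown explicitly
# (this issue makes pushPartValues.enabled the default in Spark 4.0).
spark-submit \
  --conf spark.sql.sources.v2.bucketing.enabled=true \
  --conf spark.sql.sources.v2.bucketing.pushPartValues.enabled=true \
  my_app.py
```

The same keys can be set in `spark-defaults.conf` or via `spark.conf.set(...)` in a session.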
[jira] [Updated] (SPARK-48329) Default spark.sql.sources.v2.bucketing.pushPartValues.enabled to true
[ https://issues.apache.org/jira/browse/SPARK-48329?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-48329:
----------------------------------
    Parent Issue: SPARK-44111  (was: SPARK-37375)

> Default spark.sql.sources.v2.bucketing.pushPartValues.enabled to true
> ---------------------------------------------------------------------
>
>                 Key: SPARK-48329
>                 URL: https://issues.apache.org/jira/browse/SPARK-48329
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 4.0.0
>            Reporter: Szehon Ho
>            Assignee: Szehon Ho
>            Priority: Minor
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>
> The SPJ feature flag 'spark.sql.sources.v2.bucketing.pushPartValues.enabled'
> has proven valuable for most use cases. We should take advantage of the 4.0
> release and change the default value to true.
[jira] [Resolved] (SPARK-48329) Default spark.sql.sources.v2.bucketing.pushPartValues.enabled to true
[ https://issues.apache.org/jira/browse/SPARK-48329?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-48329.
-----------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 46673
[https://github.com/apache/spark/pull/46673]

> Default spark.sql.sources.v2.bucketing.pushPartValues.enabled to true
> ---------------------------------------------------------------------
>
>                 Key: SPARK-48329
>                 URL: https://issues.apache.org/jira/browse/SPARK-48329
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 4.0.0
>            Reporter: Szehon Ho
>            Assignee: Szehon Ho
>            Priority: Minor
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>
> The SPJ feature flag 'spark.sql.sources.v2.bucketing.pushPartValues.enabled'
> has proven valuable for most use cases. We should take advantage of the 4.0
> release and change the default value to true.
[jira] [Assigned] (SPARK-48329) Default spark.sql.sources.v2.bucketing.pushPartValues.enabled to true
[ https://issues.apache.org/jira/browse/SPARK-48329?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun reassigned SPARK-48329:
-------------------------------------
    Assignee: Szehon Ho

> Default spark.sql.sources.v2.bucketing.pushPartValues.enabled to true
> ---------------------------------------------------------------------
>
>                 Key: SPARK-48329
>                 URL: https://issues.apache.org/jira/browse/SPARK-48329
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 4.0.0
>            Reporter: Szehon Ho
>            Assignee: Szehon Ho
>            Priority: Minor
>              Labels: pull-request-available
>
>
> The SPJ feature flag 'spark.sql.sources.v2.bucketing.pushPartValues.enabled'
> has proven valuable for most use cases. We should take advantage of the 4.0
> release and change the default value to true.
[jira] [Resolved] (SPARK-48328) Upgrade `Arrow` to 16.1.0
[ https://issues.apache.org/jira/browse/SPARK-48328?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-48328.
-----------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 46646
[https://github.com/apache/spark/pull/46646]

> Upgrade `Arrow` to 16.1.0
> -------------------------
>
>                 Key: SPARK-48328
>                 URL: https://issues.apache.org/jira/browse/SPARK-48328
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 4.0.0
>            Reporter: BingKun Pan
>            Assignee: BingKun Pan
>            Priority: Minor
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
[jira] [Assigned] (SPARK-48017) Add Spark application submission worker for operator
[ https://issues.apache.org/jira/browse/SPARK-48017?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun reassigned SPARK-48017:
-------------------------------------
    Assignee: Zhou JIANG

> Add Spark application submission worker for operator
> ----------------------------------------------------
>
>                 Key: SPARK-48017
>                 URL: https://issues.apache.org/jira/browse/SPARK-48017
>             Project: Spark
>          Issue Type: Sub-task
>          Components: k8s
>    Affects Versions: kubernetes-operator-0.1.0
>            Reporter: Zhou JIANG
>            Assignee: Zhou JIANG
>            Priority: Major
>              Labels: pull-request-available
>
>
> Spark Operator needs a submission worker that converts its application
> abstraction (Operator API) to k8s resources.
[jira] [Resolved] (SPARK-48017) Add Spark application submission worker for operator
[ https://issues.apache.org/jira/browse/SPARK-48017?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-48017.
-----------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 10
[https://github.com/apache/spark-kubernetes-operator/pull/10]

> Add Spark application submission worker for operator
> ----------------------------------------------------
>
>                 Key: SPARK-48017
>                 URL: https://issues.apache.org/jira/browse/SPARK-48017
>             Project: Spark
>          Issue Type: Sub-task
>          Components: k8s
>    Affects Versions: kubernetes-operator-0.1.0
>            Reporter: Zhou JIANG
>            Assignee: Zhou JIANG
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>
> Spark Operator needs a submission worker that converts its application
> abstraction (Operator API) to k8s resources.
[jira] [Resolved] (SPARK-48256) Add a rule to check file headers for the java side, and fix inconsistent files
[ https://issues.apache.org/jira/browse/SPARK-48256?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-48256.
-----------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 46557
[https://github.com/apache/spark/pull/46557]

> Add a rule to check file headers for the java side, and fix inconsistent files
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-48256
>                 URL: https://issues.apache.org/jira/browse/SPARK-48256
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 4.0.0
>            Reporter: BingKun Pan
>            Assignee: BingKun Pan
>            Priority: Minor
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
[jira] [Assigned] (SPARK-48256) Add a rule to check file headers for the java side, and fix inconsistent files
[ https://issues.apache.org/jira/browse/SPARK-48256?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun reassigned SPARK-48256:
-------------------------------------
    Assignee: BingKun Pan

> Add a rule to check file headers for the java side, and fix inconsistent files
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-48256
>                 URL: https://issues.apache.org/jira/browse/SPARK-48256
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 4.0.0
>            Reporter: BingKun Pan
>            Assignee: BingKun Pan
>            Priority: Minor
>              Labels: pull-request-available
>
[jira] [Assigned] (SPARK-48218) TransportClientFactory.createClient may NPE cause FetchFailedException
[ https://issues.apache.org/jira/browse/SPARK-48218?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun reassigned SPARK-48218:
-------------------------------------
    Assignee: dzcxzl

> TransportClientFactory.createClient may NPE cause FetchFailedException
> ----------------------------------------------------------------------
>
>                 Key: SPARK-48218
>                 URL: https://issues.apache.org/jira/browse/SPARK-48218
>             Project: Spark
>          Issue Type: Improvement
>          Components: Shuffle
>    Affects Versions: 4.0.0
>            Reporter: dzcxzl
>            Assignee: dzcxzl
>            Priority: Minor
>              Labels: pull-request-available
>
>
> {code:java}
> org.apache.spark.shuffle.FetchFailedException
>     at org.apache.spark.storage.ShuffleBlockFetcherIterator.throwFetchFailedException(ShuffleBlockFetcherIterator.scala:1180)
>     at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:913)
>     at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:84)
>     at org.apache.spark.util.CompletionIterator.next(CompletionIterator.scala:29)
> Caused by: java.lang.NullPointerException
>     at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:178)
>     at org.apache.spark.network.shuffle.ExternalBlockStoreClient.lambda$fetchBlocks$0(ExternalBlockStoreClient.java:128)
>     at org.apache.spark.network.shuffle.RetryingBlockTransferor.transferAllOutstanding(RetryingBlockTransferor.java:154)
>     at org.apache.spark.network.shuffle.RetryingBlockTransferor.start(RetryingBlockTransferor.java:133)
>     at org.apache.spark.network.shuffle.ExternalBlockStoreClient.fetchBlocks(ExternalBlockStoreClient.java:139)
> {code}
[jira] [Resolved] (SPARK-48218) TransportClientFactory.createClient may NPE cause FetchFailedException
[ https://issues.apache.org/jira/browse/SPARK-48218?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48218. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46506 [https://github.com/apache/spark/pull/46506] > TransportClientFactory.createClient may NPE cause FetchFailedException > -- > > Key: SPARK-48218 > URL: https://issues.apache.org/jira/browse/SPARK-48218 > Project: Spark > Issue Type: Improvement > Components: Shuffle >Affects Versions: 4.0.0 >Reporter: dzcxzl >Assignee: dzcxzl >Priority: Minor > Labels: pull-request-available > Fix For: 4.0.0 > > > {code:java} > org.apache.spark.shuffle.FetchFailedException > at > org.apache.spark.storage.ShuffleBlockFetcherIterator.throwFetchFailedException(ShuffleBlockFetcherIterator.scala:1180) > at > org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:913) > at > org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:84) > at > org.apache.spark.util.CompletionIterator.next(CompletionIterator.scala:29) > Caused by: java.lang.NullPointerException > at > org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:178) > at > org.apache.spark.network.shuffle.ExternalBlockStoreClient.lambda$fetchBlocks$0(ExternalBlockStoreClient.java:128) > at > org.apache.spark.network.shuffle.RetryingBlockTransferor.transferAllOutstanding(RetryingBlockTransferor.java:154) > at > org.apache.spark.network.shuffle.RetryingBlockTransferor.start(RetryingBlockTransferor.java:133) > at > org.apache.spark.network.shuffle.ExternalBlockStoreClient.fetchBlocks(ExternalBlockStoreClient.java:139) > {code} -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
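The trace above shows the NPE surfacing far from its cause: `TransportClientFactory.createClient` hands a shuffle-fetch caller a reference that can be null under concurrent access, and the failure is only reported later as a `FetchFailedException`. A minimal, hypothetical sketch of the pattern and a defensive variant (the class and method names below are illustrative, not Spark's actual implementation):

```java
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch, not Spark's TransportClientFactory: shows how a client
// pool that can observe a missing or half-initialized entry returns null,
// which later blows up inside the fetch path, and how an atomic
// computeIfAbsent lookup avoids that window.
public class ClientPoolSketch {
    public static class Client {
        private final String target;
        Client(String target) { this.target = target; }
        public String fetch(String blockId) { return target + "/" + blockId; }
    }

    private final ConcurrentHashMap<String, Client> pool = new ConcurrentHashMap<>();

    // Defensive variant: computeIfAbsent creates the entry atomically and never
    // returns null for a non-null mapping function, so callers cannot observe
    // an empty pool slot between "check" and "create".
    public Client createClient(String host, int port) {
        return pool.computeIfAbsent(host + ":" + port, Client::new);
    }

    public static void main(String[] args) {
        ClientPoolSketch factory = new ClientPoolSketch();
        Client c = factory.createClient("shuffle-host", 7337);
        System.out.println(c.fetch("shuffle_0_1_2")); // shuffle-host:7337/shuffle_0_1_2
    }
}
```

The design point is that the null has to be impossible at the factory boundary; null-checking at every call site (as in the `ExternalBlockStoreClient` frames above) is much harder to get right.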
[jira] [Resolved] (SPARK-48049) Upgrade Scala to 2.13.14
[ https://issues.apache.org/jira/browse/SPARK-48049?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48049. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46288 [https://github.com/apache/spark/pull/46288] > Upgrade Scala to 2.13.14 > > > Key: SPARK-48049 > URL: https://issues.apache.org/jira/browse/SPARK-48049 > Project: Spark > Issue Type: Sub-task > Components: Build >Affects Versions: 4.0.0 >Reporter: BingKun Pan >Assignee: BingKun Pan >Priority: Minor > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Resolved] (SPARK-48285) Update docs for size function and sizeOfNull configuration
[ https://issues.apache.org/jira/browse/SPARK-48285?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48285. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46592 [https://github.com/apache/spark/pull/46592] > Update docs for size function and sizeOfNull configuration > -- > > Key: SPARK-48285 > URL: https://issues.apache.org/jira/browse/SPARK-48285 > Project: Spark > Issue Type: Sub-task > Components: SQL >Affects Versions: 4.0.0 >Reporter: Kent Yao >Assignee: Kent Yao >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-48238) Spark fail to start due to class o.a.h.yarn.server.webproxy.amfilter.AmIpFilter is not a jakarta.servlet.Filter
[ https://issues.apache.org/jira/browse/SPARK-48238?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17846701#comment-17846701 ] Dongjoon Hyun commented on SPARK-48238: --- Hi, [~chengpan], [~HF], and [~cloud_fan]. Is it true that we need to revert SPARK-45522 and SPARK-47118 solely for YARN support? Do you think there is an alternative, as we did for Hadoop 2/Hadoop 3 support or Hive 1/Hive 2 support? For example, can we isolate the Jetty issues to the YARN module and JettyUtil via configurations? > Spark fail to start due to class > o.a.h.yarn.server.webproxy.amfilter.AmIpFilter is not a jakarta.servlet.Filter > --- > > Key: SPARK-48238 > URL: https://issues.apache.org/jira/browse/SPARK-48238 > Project: Spark > Issue Type: Sub-task > Components: Build >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Priority: Blocker > > I tested the latest master branch; it fails to start in YARN mode > {code:java} > dev/make-distribution.sh --tgz -Phive,hive-thriftserver,yarn{code} > > {code:java} > $ bin/spark-sql --master yarn > WARNING: Using incubator modules: jdk.incubator.vector > Setting default log level to "WARN". > To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use > setLogLevel(newLevel). > 2024-05-10 17:58:17 WARN NativeCodeLoader: Unable to load native-hadoop > library for your platform... using builtin-java classes where applicable > 2024-05-10 17:58:18 WARN Client: Neither spark.yarn.jars nor > spark.yarn.archive is set, falling back to uploading libraries under > SPARK_HOME. > 2024-05-10 17:58:25 ERROR SparkContext: Error initializing SparkContext.
> org.sparkproject.jetty.util.MultiException: Multiple exceptions > at > org.sparkproject.jetty.util.MultiException.ifExceptionThrow(MultiException.java:117) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.sparkproject.jetty.servlet.ServletHandler.initialize(ServletHandler.java:751) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.sparkproject.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:392) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.sparkproject.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:902) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.sparkproject.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:306) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:93) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at org.apache.spark.ui.ServerInfo.addHandler(JettyUtils.scala:514) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2(SparkUI.scala:81) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2$adapted(SparkUI.scala:81) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619) > ~[scala-library-2.13.13.jar:?] > at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617) > ~[scala-library-2.13.13.jar:?] > at scala.collection.AbstractIterable.foreach(Iterable.scala:935) > ~[scala-library-2.13.13.jar:?] 
> at > org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1(SparkUI.scala:81) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1$adapted(SparkUI.scala:79) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?] > at org.apache.spark.ui.SparkUI.attachAllHandlers(SparkUI.scala:79) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at org.apache.spark.SparkContext.$anonfun$new$31(SparkContext.scala:690) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.apache.spark.SparkContext.$anonfun$new$31$adapted(SparkContext.scala:690) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?] > at org.apache.spark.SparkContext.(SparkContext.scala:690) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2963) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:1118) > ~[spark-sql_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at scala.Option.getOrElse(Option.scala:201) [scala-library-2.13.13.jar:?] > at >
[jira] [Updated] (SPARK-48238) Spark fail to start due to class o.a.h.yarn.server.webproxy.amfilter.AmIpFilter is not a jakarta.servlet.Filter
[ https://issues.apache.org/jira/browse/SPARK-48238?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-48238: -- Description: I tested the latest master branch, it failed to start on YARN mode {code:java} dev/make-distribution.sh --tgz -Phive,hive-thriftserver,yarn{code} {code:java} $ bin/spark-sql --master yarn WARNING: Using incubator modules: jdk.incubator.vector Setting default log level to "WARN". To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel). 2024-05-10 17:58:17 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 2024-05-10 17:58:18 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive} is set, falling back to uploading libraries under SPARK_HOME. 2024-05-10 17:58:25 ERROR SparkContext: Error initializing SparkContext. org.sparkproject.jetty.util.MultiException: Multiple exceptions at org.sparkproject.jetty.util.MultiException.ifExceptionThrow(MultiException.java:117) ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.sparkproject.jetty.servlet.ServletHandler.initialize(ServletHandler.java:751) ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.sparkproject.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:392) ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.sparkproject.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:902) ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.sparkproject.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:306) ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:93) ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.spark.ui.ServerInfo.addHandler(JettyUtils.scala:514) ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at 
org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2(SparkUI.scala:81) ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2$adapted(SparkUI.scala:81) ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619) ~[scala-library-2.13.13.jar:?] at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617) ~[scala-library-2.13.13.jar:?] at scala.collection.AbstractIterable.foreach(Iterable.scala:935) ~[scala-library-2.13.13.jar:?] at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1(SparkUI.scala:81) ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1$adapted(SparkUI.scala:79) ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?] at org.apache.spark.ui.SparkUI.attachAllHandlers(SparkUI.scala:79) ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.spark.SparkContext.$anonfun$new$31(SparkContext.scala:690) ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.spark.SparkContext.$anonfun$new$31$adapted(SparkContext.scala:690) ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?] at org.apache.spark.SparkContext.(SparkContext.scala:690) ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2963) ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:1118) ~[spark-sql_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at scala.Option.getOrElse(Option.scala:201) [scala-library-2.13.13.jar:?] 
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:1112) [spark-sql_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:64) [spark-hive-thriftserver_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.(SparkSQLCLIDriver.scala:405) [spark-hive-thriftserver_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:162) [spark-hive-thriftserver_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala) [spark-hive-thriftserver_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) ~[?:?] at
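The "is not a jakarta.servlet.Filter" failure above is the classic javax-to-jakarta namespace split: Hadoop YARN's `AmIpFilter` is compiled against the old `javax.servlet` API, and a Jakarta-based Jetty rejects it because the two `Filter` interfaces are unrelated types even when their method shapes match. A self-contained sketch with stand-in interfaces (hypothetical names; the real servlet APIs are deliberately not used here):

```java
// Hedged illustration of the javax -> jakarta namespace split using stand-in
// interfaces. Two interfaces with identical methods but different packages are
// unrelated types in Java, so an instanceof check against the new one fails
// for classes compiled against the old one.
public class NamespaceSplitSketch {
    public interface JavaxFilter { void doFilter(); }    // stand-in for javax.servlet.Filter
    public interface JakartaFilter { void doFilter(); }  // stand-in for jakarta.servlet.Filter

    // An old-namespace filter, like AmIpFilter compiled against javax.servlet.
    public static class AmIpFilterLike implements JavaxFilter {
        @Override public void doFilter() { /* no-op */ }
    }

    // What a Jakarta-based container effectively checks before installing a filter.
    public static boolean isJakartaFilter(Object candidate) {
        return candidate instanceof JakartaFilter;
    }

    public static boolean oldNamespaceFilterPassesCheck() {
        return isJakartaFilter(new AmIpFilterLike());
    }

    public static void main(String[] args) {
        // Identical method shape, but an unrelated interface type: the check fails.
        System.out.println(oldNamespaceFilterPassesCheck()); // false
    }
}
```

This is why the discussion above centers on isolating Jetty (and its servlet API choice) per module rather than fixing a single class: no amount of matching signatures makes an old-namespace filter satisfy the new-namespace type check.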
[jira] [Resolved] (SPARK-48279) Upgrade ORC to 2.0.1
[ https://issues.apache.org/jira/browse/SPARK-48279?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48279. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46587 [https://github.com/apache/spark/pull/46587] > Upgrade ORC to 2.0.1 > > > Key: SPARK-48279 > URL: https://issues.apache.org/jira/browse/SPARK-48279 > Project: Spark > Issue Type: Bug > Components: Build >Affects Versions: 4.0.0 >Reporter: William Hyun >Assignee: William Hyun >Priority: Minor > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Updated] (SPARK-48231) Remove unused CodeHaus Jackson dependencies
[ https://issues.apache.org/jira/browse/SPARK-48231?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-48231: -- Parent: (was: SPARK-47046) Issue Type: Bug (was: Sub-task) > Remove unused CodeHaus Jackson dependencies > --- > > Key: SPARK-48231 > URL: https://issues.apache.org/jira/browse/SPARK-48231 > Project: Spark > Issue Type: Bug > Components: Build >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Major > Labels: pull-request-available >
[jira] [Updated] (SPARK-48230) Remove unused jodd-core
[ https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-48230: -- Parent: (was: SPARK-47046) Issue Type: Bug (was: Sub-task) > Remove unused jodd-core > --- > > Key: SPARK-48230 > URL: https://issues.apache.org/jira/browse/SPARK-48230 > Project: Spark > Issue Type: Bug > Components: Build >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Priority: Major > Labels: pull-request-available >
[jira] [Updated] (SPARK-48237) After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be deleted
[ https://issues.apache.org/jira/browse/SPARK-48237?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-48237: -- Issue Type: Bug (was: Improvement) > After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be > deleted > > > Key: SPARK-48237 > URL: https://issues.apache.org/jira/browse/SPARK-48237 > Project: Spark > Issue Type: Bug > Components: Build >Affects Versions: 4.0.0 >Reporter: BingKun Pan >Assignee: BingKun Pan >Priority: Minor > Labels: pull-request-available > Fix For: 4.0.0, 3.5.2, 3.4.4 > >
[jira] [Assigned] (SPARK-48237) After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be deleted
[ https://issues.apache.org/jira/browse/SPARK-48237?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun reassigned SPARK-48237: - Assignee: BingKun Pan > After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be > deleted > > > Key: SPARK-48237 > URL: https://issues.apache.org/jira/browse/SPARK-48237 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: BingKun Pan >Assignee: BingKun Pan >Priority: Minor > Labels: pull-request-available >
[jira] [Resolved] (SPARK-48237) After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be deleted
[ https://issues.apache.org/jira/browse/SPARK-48237?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48237. --- Fix Version/s: 3.4.4 3.5.2 4.0.0 Resolution: Fixed Issue resolved by pull request 46531 [https://github.com/apache/spark/pull/46531] > After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be > deleted > > > Key: SPARK-48237 > URL: https://issues.apache.org/jira/browse/SPARK-48237 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: BingKun Pan >Assignee: BingKun Pan >Priority: Minor > Labels: pull-request-available > Fix For: 3.4.4, 3.5.2, 4.0.0 > >
[jira] [Resolved] (SPARK-48236) Add `commons-lang:commons-lang:2.6` back to support legacy Hive UDF jars
[ https://issues.apache.org/jira/browse/SPARK-48236?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48236. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46528 [https://github.com/apache/spark/pull/46528] > Add `commons-lang:commons-lang:2.6` back to support legacy Hive UDF jars > > > Key: SPARK-48236 > URL: https://issues.apache.org/jira/browse/SPARK-48236 > Project: Spark > Issue Type: Sub-task > Components: Build >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Critical > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Commented] (SPARK-48230) Remove unused jodd-core
[ https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17845522#comment-17845522 ] Dongjoon Hyun commented on SPARK-48230: --- We will revisit this dependency. > Remove unused jodd-core > --- > > Key: SPARK-48230 > URL: https://issues.apache.org/jira/browse/SPARK-48230 > Project: Spark > Issue Type: Sub-task > Components: Build >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Priority: Major > Labels: pull-request-available >
[jira] [Updated] (SPARK-48230) Remove unused jodd-core
[ https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-48230: -- Fix Version/s: (was: 4.0.0) > Remove unused jodd-core > --- > > Key: SPARK-48230 > URL: https://issues.apache.org/jira/browse/SPARK-48230 > Project: Spark > Issue Type: Sub-task > Components: Build >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Priority: Major > Labels: pull-request-available >
[jira] [Resolved] (SPARK-48144) canPlanAsBroadcastHashJoin should respect shuffle join hints
[ https://issues.apache.org/jira/browse/SPARK-48144?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48144. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46401 [https://github.com/apache/spark/pull/46401] > canPlanAsBroadcastHashJoin should respect shuffle join hints > > > Key: SPARK-48144 > URL: https://issues.apache.org/jira/browse/SPARK-48144 > Project: Spark > Issue Type: Bug > Components: Optimizer >Affects Versions: 4.0.0, 3.5.2, 3.4.4 >Reporter: Fredrik Klauß >Assignee: Fredrik Klauß >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > > Currently, `canPlanAsBroadcastHashJoin` incorrectly returns that a join can > be planned as a BHJ, even though the join contains a SHJ.
[jira] [Assigned] (SPARK-48144) canPlanAsBroadcastHashJoin should respect shuffle join hints
[ https://issues.apache.org/jira/browse/SPARK-48144?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun reassigned SPARK-48144: - Assignee: Fredrik Klauß > canPlanAsBroadcastHashJoin should respect shuffle join hints > > > Key: SPARK-48144 > URL: https://issues.apache.org/jira/browse/SPARK-48144 > Project: Spark > Issue Type: Bug > Components: Optimizer >Affects Versions: 4.0.0, 3.5.2, 3.4.4 >Reporter: Fredrik Klauß >Assignee: Fredrik Klauß >Priority: Major > Labels: pull-request-available > > Currently, `canPlanAsBroadcastHashJoin` incorrectly returns that a join can > be planned as a BHJ, even though the join contains a SHJ.
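A hedged sketch of the logic bug reported above (illustrative only, not Spark's planner code): a broadcast-eligibility check driven purely by size estimates silently overrides an explicit shuffle join hint, so the guard has to consult the hints before the size threshold.

```java
import java.util.EnumSet;
import java.util.Set;

// Hypothetical model of a "can we plan this as a broadcast hash join?" check.
// The fixed variant below lets an explicit shuffle hint veto broadcasting,
// which is the behavior SPARK-48144 asks for; names and thresholds are
// illustrative, not Spark's internals.
public class JoinHintSketch {
    public enum Hint { BROADCAST, SHUFFLE_HASH, SHUFFLE_MERGE }

    public static boolean canPlanAsBroadcastHashJoin(long sizeInBytes,
                                                     long broadcastThreshold,
                                                     Set<Hint> hints) {
        // An explicit shuffle join hint must veto broadcasting, regardless of size.
        if (hints.contains(Hint.SHUFFLE_HASH) || hints.contains(Hint.SHUFFLE_MERGE)) {
            return false;
        }
        // Otherwise broadcast if hinted or if the build side is small enough.
        return hints.contains(Hint.BROADCAST) || sizeInBytes <= broadcastThreshold;
    }

    public static void main(String[] args) {
        long threshold = 10L * 1024 * 1024; // akin to an auto-broadcast threshold
        // Tiny table, but the user asked for a shuffle hash join: must not broadcast.
        System.out.println(canPlanAsBroadcastHashJoin(1024, threshold,
                EnumSet.of(Hint.SHUFFLE_HASH))); // false
    }
}
```

The buggy form is the same function without the first `if`: size alone decides, and the hint is lost.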
[jira] [Assigned] (SPARK-47441) Do not add log link for unmanaged AM in Spark UI
[ https://issues.apache.org/jira/browse/SPARK-47441?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun reassigned SPARK-47441: - Assignee: Yuming Wang > Do not add log link for unmanaged AM in Spark UI > > > Key: SPARK-47441 > URL: https://issues.apache.org/jira/browse/SPARK-47441 > Project: Spark > Issue Type: Bug > Components: YARN >Affects Versions: 3.5.0, 3.5.1 >Reporter: Yuming Wang >Assignee: Yuming Wang >Priority: Major > Labels: pull-request-available > > {noformat} > 24/03/18 04:58:25,022 ERROR [spark-listener-group-appStatus] > scheduler.AsyncEventQueue:97 : Listener AppStatusListener threw an exception > java.lang.NumberFormatException: For input string: "null" > at > java.lang.NumberFormatException.forInputString(NumberFormatException.java:67) > ~[?:?] > at java.lang.Integer.parseInt(Integer.java:668) ~[?:?] > at java.lang.Integer.parseInt(Integer.java:786) ~[?:?] > at scala.collection.immutable.StringLike.toInt(StringLike.scala:310) > ~[scala-library-2.12.18.jar:?] > at scala.collection.immutable.StringLike.toInt$(StringLike.scala:310) > ~[scala-library-2.12.18.jar:?] > at scala.collection.immutable.StringOps.toInt(StringOps.scala:33) > ~[scala-library-2.12.18.jar:?] 
> at org.apache.spark.util.Utils$.parseHostPort(Utils.scala:1105) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.status.ProcessSummaryWrapper.(storeTypes.scala:609) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.status.LiveMiscellaneousProcess.doUpdate(LiveEntity.scala:1045) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at org.apache.spark.status.LiveEntity.write(LiveEntity.scala:50) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.status.AppStatusListener.update(AppStatusListener.scala:1233) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.status.AppStatusListener.onMiscellaneousProcessAdded(AppStatusListener.scala:1445) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.status.AppStatusListener.onOtherEvent(AppStatusListener.scala:113) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.scheduler.SparkListenerBus.doPostEvent(SparkListenerBus.scala:100) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.scheduler.SparkListenerBus.doPostEvent$(SparkListenerBus.scala:28) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:117) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:101) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:105) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:105) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23) > ~[scala-library-2.12.18.jar:?] 
> at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) > ~[scala-library-2.12.18.jar:?] > at > org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1356) > [spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96) > [spark-core_2.12-3.5.1.jar:3.5.1] > {noformat} -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-47441) Do not add log link for unmanaged AM in Spark UI
[ https://issues.apache.org/jira/browse/SPARK-47441?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-47441. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 45565 [https://github.com/apache/spark/pull/45565] > Do not add log link for unmanaged AM in Spark UI > > > Key: SPARK-47441 > URL: https://issues.apache.org/jira/browse/SPARK-47441 > Project: Spark > Issue Type: Bug > Components: YARN >Affects Versions: 3.5.0, 3.5.1 >Reporter: Yuming Wang >Assignee: Yuming Wang >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > > {noformat} > 24/03/18 04:58:25,022 ERROR [spark-listener-group-appStatus] > scheduler.AsyncEventQueue:97 : Listener AppStatusListener threw an exception > java.lang.NumberFormatException: For input string: "null" > at > java.lang.NumberFormatException.forInputString(NumberFormatException.java:67) > ~[?:?] > at java.lang.Integer.parseInt(Integer.java:668) ~[?:?] > at java.lang.Integer.parseInt(Integer.java:786) ~[?:?] > at scala.collection.immutable.StringLike.toInt(StringLike.scala:310) > ~[scala-library-2.12.18.jar:?] > at scala.collection.immutable.StringLike.toInt$(StringLike.scala:310) > ~[scala-library-2.12.18.jar:?] > at scala.collection.immutable.StringOps.toInt(StringOps.scala:33) > ~[scala-library-2.12.18.jar:?] 
> at org.apache.spark.util.Utils$.parseHostPort(Utils.scala:1105) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.status.ProcessSummaryWrapper.(storeTypes.scala:609) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.status.LiveMiscellaneousProcess.doUpdate(LiveEntity.scala:1045) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at org.apache.spark.status.LiveEntity.write(LiveEntity.scala:50) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.status.AppStatusListener.update(AppStatusListener.scala:1233) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.status.AppStatusListener.onMiscellaneousProcessAdded(AppStatusListener.scala:1445) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.status.AppStatusListener.onOtherEvent(AppStatusListener.scala:113) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.scheduler.SparkListenerBus.doPostEvent(SparkListenerBus.scala:100) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.scheduler.SparkListenerBus.doPostEvent$(SparkListenerBus.scala:28) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:117) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:101) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:105) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:105) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23) > ~[scala-library-2.12.18.jar:?] 
> at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) > ~[scala-library-2.12.18.jar:?] > at > org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96) > ~[spark-core_2.12-3.5.1.jar:3.5.1] > at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1356) > [spark-core_2.12-3.5.1.jar:3.5.1] > at > org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96) > [spark-core_2.12-3.5.1.jar:3.5.1] > {noformat} -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
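The root failure in the trace above is `Integer.parseInt` receiving the literal string "null" inside `Utils.parseHostPort`: the unmanaged AM apparently registers a process whose host:port string carries "null" as the port. A minimal reproduction with a defensive parse (a hypothetical helper, not Spark's actual `parseHostPort`):

```java
// Hypothetical defensive host:port parser. A port segment that is not a valid
// integer -- such as the literal string "null" from an unmanaged AM -- falls
// back to a default instead of throwing NumberFormatException in a listener.
public class HostPortSketch {
    public static int parsePortOrDefault(String hostPort, int defaultPort) {
        int idx = hostPort.lastIndexOf(':');
        if (idx < 0 || idx == hostPort.length() - 1) {
            return defaultPort; // no port segment at all
        }
        try {
            return Integer.parseInt(hostPort.substring(idx + 1));
        } catch (NumberFormatException e) {
            return defaultPort; // e.g. the port part was the string "null"
        }
    }

    public static void main(String[] args) {
        System.out.println(parsePortOrDefault("driver-host:null", -1)); // -1
        System.out.println(parsePortOrDefault("driver-host:7077", -1)); // 7077
    }
}
```

The resolved fix takes the other defensible route, skipping the log-link registration for unmanaged AMs entirely so the malformed host:port never reaches the listener.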
[jira] [Assigned] (SPARK-48235) Directly pass join instead of all arguments to getBroadcastBuildSide and getShuffleHashJoinBuildSide
[ https://issues.apache.org/jira/browse/SPARK-48235?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun reassigned SPARK-48235: - Assignee: Fredrik Klauß (was: Dongjoon Hyun) > Directly pass join instead of all arguments to getBroadcastBuildSide and > getShuffleHashJoinBuildSide > > > Key: SPARK-48235 > URL: https://issues.apache.org/jira/browse/SPARK-48235 > Project: Spark > Issue Type: Improvement > Components: SQL >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Assignee: Fredrik Klauß >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Updated] (SPARK-48235) Directly pass join instead of all arguments to getBroadcastBuildSide and getShuffleHashJoinBuildSide
[ https://issues.apache.org/jira/browse/SPARK-48235?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-48235: -- Reporter: Fredrik Klauß (was: Dongjoon Hyun) > Directly pass join instead of all arguments to getBroadcastBuildSide and > getShuffleHashJoinBuildSide > > > Key: SPARK-48235 > URL: https://issues.apache.org/jira/browse/SPARK-48235 > Project: Spark > Issue Type: Improvement > Components: SQL >Affects Versions: 4.0.0 >Reporter: Fredrik Klauß >Assignee: Fredrik Klauß >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Resolved] (SPARK-48235) Directly pass join instead of all arguments to getBroadcastBuildSide and getShuffleHashJoinBuildSide
[ https://issues.apache.org/jira/browse/SPARK-48235?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48235. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46525 [https://github.com/apache/spark/pull/46525] > Directly pass join instead of all arguments to getBroadcastBuildSide and > getShuffleHashJoinBuildSide > > > Key: SPARK-48235 > URL: https://issues.apache.org/jira/browse/SPARK-48235 > Project: Spark > Issue Type: Improvement > Components: SQL >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Assigned] (SPARK-48235) Directly pass join instead of all arguments to getBroadcastBuildSide and getShuffleHashJoinBuildSide
[ https://issues.apache.org/jira/browse/SPARK-48235?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun reassigned SPARK-48235: - Assignee: Dongjoon Hyun > Directly pass join instead of all arguments to getBroadcastBuildSide and > getShuffleHashJoinBuildSide > > > Key: SPARK-48235 > URL: https://issues.apache.org/jira/browse/SPARK-48235 > Project: Spark > Issue Type: Improvement > Components: SQL >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Major > Labels: pull-request-available >
[jira] [Created] (SPARK-48235) Directly pass join instead of all arguments to getBroadcastBuildSide and getShuffleHashJoinBuildSide
Dongjoon Hyun created SPARK-48235: - Summary: Directly pass join instead of all arguments to getBroadcastBuildSide and getShuffleHashJoinBuildSide Key: SPARK-48235 URL: https://issues.apache.org/jira/browse/SPARK-48235 Project: Spark Issue Type: Improvement Components: SQL Affects Versions: 4.0.0 Reporter: Dongjoon Hyun
[jira] [Resolved] (SPARK-48230) Remove unused jodd-core
[ https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48230. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46520 [https://github.com/apache/spark/pull/46520] > Remove unused jodd-core > --- > > Key: SPARK-48230 > URL: https://issues.apache.org/jira/browse/SPARK-48230 > Project: Spark > Issue Type: Sub-task > Components: Build >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Assigned] (SPARK-48231) Remove unused CodeHaus Jackson dependencies
[ https://issues.apache.org/jira/browse/SPARK-48231?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun reassigned SPARK-48231: - Assignee: Cheng Pan > Remove unused CodeHaus Jackson dependencies > --- > > Key: SPARK-48231 > URL: https://issues.apache.org/jira/browse/SPARK-48231 > Project: Spark > Issue Type: Sub-task > Components: Build >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Major > Labels: pull-request-available >
[jira] [Assigned] (SPARK-48230) Remove unused jodd-core
[ https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun reassigned SPARK-48230: - Assignee: Cheng Pan > Remove unused jodd-core > --- > > Key: SPARK-48230 > URL: https://issues.apache.org/jira/browse/SPARK-48230 > Project: Spark > Issue Type: Sub-task > Components: Build >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Major > Labels: pull-request-available >
[jira] [Updated] (SPARK-48231) Remove unused CodeHaus Jackson dependencies
[ https://issues.apache.org/jira/browse/SPARK-48231?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-48231: -- Parent: SPARK-47046 Issue Type: Sub-task (was: Improvement) > Remove unused CodeHaus Jackson dependencies > --- > > Key: SPARK-48231 > URL: https://issues.apache.org/jira/browse/SPARK-48231 > Project: Spark > Issue Type: Sub-task > Components: Build >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Priority: Major > Labels: pull-request-available >
[jira] [Resolved] (SPARK-47847) Deprecate spark.network.remoteReadNioBufferConversion
[ https://issues.apache.org/jira/browse/SPARK-47847?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-47847. --- Fix Version/s: 3.5.2 4.0.0 Resolution: Fixed Issue resolved by pull request 46047 [https://github.com/apache/spark/pull/46047] > Deprecate spark.network.remoteReadNioBufferConversion > - > > Key: SPARK-47847 > URL: https://issues.apache.org/jira/browse/SPARK-47847 > Project: Spark > Issue Type: Improvement > Components: Shuffle, Spark Core >Affects Versions: 3.5.2 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Major > Labels: pull-request-available > Fix For: 3.5.2, 4.0.0 > >
[jira] [Assigned] (SPARK-47847) Deprecate spark.network.remoteReadNioBufferConversion
[ https://issues.apache.org/jira/browse/SPARK-47847?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun reassigned SPARK-47847: - Assignee: Cheng Pan > Deprecate spark.network.remoteReadNioBufferConversion > - > > Key: SPARK-47847 > URL: https://issues.apache.org/jira/browse/SPARK-47847 > Project: Spark > Issue Type: Improvement > Components: Shuffle, Spark Core >Affects Versions: 3.5.2 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Major > Labels: pull-request-available >
[jira] [Updated] (SPARK-47847) Deprecate spark.network.remoteReadNioBufferConversion
[ https://issues.apache.org/jira/browse/SPARK-47847?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-47847: -- Parent: SPARK-44111 Issue Type: Sub-task (was: Improvement) > Deprecate spark.network.remoteReadNioBufferConversion > - > > Key: SPARK-47847 > URL: https://issues.apache.org/jira/browse/SPARK-47847 > Project: Spark > Issue Type: Sub-task > Components: Shuffle, Spark Core >Affects Versions: 3.5.2 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0, 3.5.2 > >
[jira] [Updated] (SPARK-48230) Remove unused jodd-core
[ https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-48230: -- Parent: SPARK-47046 Issue Type: Sub-task (was: Improvement) > Remove unused jodd-core > --- > > Key: SPARK-48230 > URL: https://issues.apache.org/jira/browse/SPARK-48230 > Project: Spark > Issue Type: Sub-task > Components: Build >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Priority: Major > Labels: pull-request-available >
[jira] [Updated] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance
[ https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-48094: -- Description: h2. ASF INFRA POLICY - [https://infra.apache.org/github-actions-policy.html] h2. MONITORING - [https://infra-reports.apache.org/#ghactions=spark=168] !Screenshot 2024-05-02 at 23.56.05.png|width=100%! h2. TARGET * All workflows MUST have a job concurrency level less than or equal to 20. This means a workflow cannot have more than 20 jobs running at the same time across all matrices. * All workflows SHOULD have a job concurrency level less than or equal to 15. Just because 20 is the max, doesn't mean you should strive for 20. * The average number of minutes a project uses per calendar week MUST NOT exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 hours). * The average number of minutes a project uses in any consecutive five-day period MUST NOT exceed the equivalent of 30 full-time runners (216,000 minutes, or 3,600 hours). h2. DEADLINE {quote}17th of May, 2024 {quote} was: h2. ASF INFRA POLICY - [https://infra.apache.org/github-actions-policy.html] h2. MONITORING - [https://infra-reports.apache.org/#ghactions=spark=168] !Screenshot 2024-05-02 at 23.56.05.png|width=100! h2. TARGET * All workflows MUST have a job concurrency level less than or equal to 20. This means a workflow cannot have more than 20 jobs running at the same time across all matrices. * All workflows SHOULD have a job concurrency level less than or equal to 15. Just because 20 is the max, doesn't mean you should strive for 20. * The average number of minutes a project uses per calendar week MUST NOT exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 hours). * The average number of minutes a project uses in any consecutive five-day period MUST NOT exceed the equivalent of 30 full-time runners (216,000 minutes, or 3,600 hours). h2. 
DEADLINE {quote}17th of May, 2024 {quote} > Reduce GitHub Action usage according to ASF project allowance > - > > Key: SPARK-48094 > URL: https://issues.apache.org/jira/browse/SPARK-48094 > Project: Spark > Issue Type: Umbrella > Components: Project Infra >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Priority: Major > Attachments: Screenshot 2024-05-02 at 23.56.05.png > > > h2. ASF INFRA POLICY > - [https://infra.apache.org/github-actions-policy.html] > h2. MONITORING > - [https://infra-reports.apache.org/#ghactions=spark=168] > !Screenshot 2024-05-02 at 23.56.05.png|width=100%! > h2. TARGET > * All workflows MUST have a job concurrency level less than or equal to 20. > This means a workflow cannot have more than 20 jobs running at the same time > across all matrices. > * All workflows SHOULD have a job concurrency level less than or equal to > 15. Just because 20 is the max, doesn't mean you should strive for 20. > * The average number of minutes a project uses per calendar week MUST NOT > exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 > hours). > * The average number of minutes a project uses in any consecutive five-day > period MUST NOT exceed the equivalent of 30 full-time runners (216,000 > minutes, or 3,600 hours). > h2. DEADLINE > {quote}17th of May, 2024 > {quote}
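The runner-minute budgets quoted in the policy follow directly from the runner counts, where a "full-time runner" is one machine busy 24 hours a day for the whole window. A quick back-of-the-envelope check (plain Python, not part of any Spark or ASF tooling):

```python
# ASF GitHub Actions budgets expressed as runner-minutes.
MINUTES_PER_DAY = 24 * 60

# Rolling five-day window: 30 full-time runners.
five_day_budget_min = 30 * 5 * MINUTES_PER_DAY
print(five_day_budget_min, five_day_budget_min // 60)  # 216000 3600

# Calendar week: 25 full-time runners. Exact arithmetic gives 252,000
# minutes (4,200 hours); the policy text rounds the minutes figure down
# to "250,000 minutes".
weekly_exact_min = 25 * 7 * MINUTES_PER_DAY
print(weekly_exact_min, weekly_exact_min // 60)  # 252000 4200
```

Note that the five-day figure (216,000 minutes = 3,600 hours) matches the policy exactly, while the weekly figure is slightly rounded in the policy wording.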
[jira] [Resolved] (SPARK-48201) Docstrings of the pyspark DataStream Reader methods are inaccurate
[ https://issues.apache.org/jira/browse/SPARK-48201?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48201. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46416 [https://github.com/apache/spark/pull/46416] > Docstrings of the pyspark DataStream Reader methods are inaccurate > -- > > Key: SPARK-48201 > URL: https://issues.apache.org/jira/browse/SPARK-48201 > Project: Spark > Issue Type: Documentation > Components: PySpark >Affects Versions: 3.4.3 >Reporter: Chloe He >Assignee: Chloe He >Priority: Minor > Labels: pull-request-available > Fix For: 4.0.0 > > > The docstrings of the pyspark DataStream Reader methods {{csv()}} and > {{text()}} say that the {{path}} parameter can be a list, but actually when a > list is passed an error is raised.
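The inaccuracy fixed here is a mismatch between documented and actually accepted argument types. A minimal standalone sketch of the same bug class (plain Python with a hypothetical `load` function, not the real PySpark `DataStreamReader`):

```python
def load(path):
    """Load data from a source.

    path : str or list of str  <-- docstring claims lists are accepted
    """
    # Actual behavior: only a single string path is handled, so the
    # docstring over-promises -- the kind of inaccuracy SPARK-48201 fixes
    # by correcting the documented type of `path`.
    if not isinstance(path, str):
        raise TypeError(f"path must be a str, got {type(path).__name__}")
    return f"loaded:{path}"

print(load("data.csv"))           # works with a single str path
try:
    load(["a.csv", "b.csv"])      # docstring says OK, implementation rejects it
except TypeError as e:
    print("TypeError:", e)
```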
[jira] [Assigned] (SPARK-48201) Docstrings of the pyspark DataStream Reader methods are inaccurate
[ https://issues.apache.org/jira/browse/SPARK-48201?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun reassigned SPARK-48201: - Assignee: Chloe He > Docstrings of the pyspark DataStream Reader methods are inaccurate > -- > > Key: SPARK-48201 > URL: https://issues.apache.org/jira/browse/SPARK-48201 > Project: Spark > Issue Type: Documentation > Components: PySpark >Affects Versions: 3.4.3 >Reporter: Chloe He >Assignee: Chloe He >Priority: Minor > Labels: pull-request-available > > The docstrings of the pyspark DataStream Reader methods {{csv()}} and > {{text()}} say that the {{path}} parameter can be a list, but actually when a > list is passed an error is raised.
[jira] [Resolved] (SPARK-48228) Implement the missing function validation in ApplyInXXX
[ https://issues.apache.org/jira/browse/SPARK-48228?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48228. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46519 [https://github.com/apache/spark/pull/46519] > Implement the missing function validation in ApplyInXXX > --- > > Key: SPARK-48228 > URL: https://issues.apache.org/jira/browse/SPARK-48228 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 4.0.0 >Reporter: Ruifeng Zheng >Assignee: Ruifeng Zheng >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Assigned] (SPARK-48228) Implement the missing function validation in ApplyInXXX
[ https://issues.apache.org/jira/browse/SPARK-48228?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun reassigned SPARK-48228: - Assignee: Ruifeng Zheng > Implement the missing function validation in ApplyInXXX > --- > > Key: SPARK-48228 > URL: https://issues.apache.org/jira/browse/SPARK-48228 > Project: Spark > Issue Type: Sub-task > Components: Connect, PySpark >Affects Versions: 4.0.0 >Reporter: Ruifeng Zheng >Assignee: Ruifeng Zheng >Priority: Major > Labels: pull-request-available >
[jira] [Resolved] (SPARK-48224) Disable variant from being a part of a map key
[ https://issues.apache.org/jira/browse/SPARK-48224?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48224. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46516 [https://github.com/apache/spark/pull/46516] > Disable variant from being a part of a map key > -- > > Key: SPARK-48224 > URL: https://issues.apache.org/jira/browse/SPARK-48224 > Project: Spark > Issue Type: Sub-task > Components: SQL >Affects Versions: 4.0.0 >Reporter: Harsh Motwani >Assignee: Harsh Motwani >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > > Creating a map object with a variant key currently works. However, this > behavior should be disabled.
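Conceptually this change is a schema-validation rule: map types whose key is of a disallowed type are rejected up front rather than accepted silently. An illustrative sketch of such a rule (hypothetical simplified type model with type names as strings, not Spark's actual analyzer code):

```python
# Hypothetical simplified model: a map type is a (key_type, value_type)
# pair of type names, and "variant" may not be used as a map key type.
DISALLOWED_KEY_TYPES = {"variant"}

def validate_map_type(key_type: str, value_type: str) -> None:
    """Reject map types whose key type is disallowed; return None if OK."""
    if key_type in DISALLOWED_KEY_TYPES:
        raise TypeError(f"map key type '{key_type}' is not allowed")

validate_map_type("string", "variant")   # fine: variant as a *value* is OK
try:
    validate_map_type("variant", "int")  # the case SPARK-48224 disables
except TypeError as e:
    print("TypeError:", e)
```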
[jira] [Assigned] (SPARK-48224) Disable variant from being a part of a map key
[ https://issues.apache.org/jira/browse/SPARK-48224?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun reassigned SPARK-48224: - Assignee: Harsh Motwani > Disable variant from being a part of a map key > -- > > Key: SPARK-48224 > URL: https://issues.apache.org/jira/browse/SPARK-48224 > Project: Spark > Issue Type: Sub-task > Components: SQL >Affects Versions: 4.0.0 >Reporter: Harsh Motwani >Assignee: Harsh Motwani >Priority: Major > Labels: pull-request-available > > Creating a map object with a variant key currently works. However, this > behavior should be disabled.
[jira] [Updated] (SPARK-48163) Disable `SparkConnectServiceSuite.SPARK-43923: commands send events - get_resources_command`
[ https://issues.apache.org/jira/browse/SPARK-48163?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-48163: -- Parent: (was: SPARK-44111) Issue Type: Bug (was: Sub-task) > Disable `SparkConnectServiceSuite.SPARK-43923: commands send events - > get_resources_command` > > > Key: SPARK-48163 > URL: https://issues.apache.org/jira/browse/SPARK-48163 > Project: Spark > Issue Type: Bug > Components: SQL, Tests >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Priority: Major > Labels: pull-request-available > > {code} > - SPARK-43923: commands send events ((get_resources_command { > [info] } > [info] ,None)) *** FAILED *** (35 milliseconds) > [info] VerifyEvents.this.listener.executeHolder.isDefined was false > (SparkConnectServiceSuite.scala:873) > {code}
[jira] [Closed] (SPARK-37626) Upgrade libthrift to 0.15.0
[ https://issues.apache.org/jira/browse/SPARK-37626?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun closed SPARK-37626. - > Upgrade libthrift to 0.15.0 > --- > > Key: SPARK-37626 > URL: https://issues.apache.org/jira/browse/SPARK-37626 > Project: Spark > Issue Type: Bug > Components: Build >Affects Versions: 3.3.0 >Reporter: Bo Zhang >Priority: Major > > Upgrade libthrift to 0.15.0 in order to avoid > https://nvd.nist.gov/vuln/detail/CVE-2020-13949.
[jira] [Resolved] (SPARK-47018) Upgrade built-in Hive to 2.3.10
[ https://issues.apache.org/jira/browse/SPARK-47018?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-47018. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46468 [https://github.com/apache/spark/pull/46468] > Upgrade built-in Hive to 2.3.10 > --- > > Key: SPARK-47018 > URL: https://issues.apache.org/jira/browse/SPARK-47018 > Project: Spark > Issue Type: Sub-task > Components: Build, SQL >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Updated] (SPARK-47834) Mark deprecated functions with `@deprecated` in `SQLImplicits`
[ https://issues.apache.org/jira/browse/SPARK-47834?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-47834: -- Parent: SPARK-44111 Issue Type: Sub-task (was: Improvement) > Mark deprecated functions with `@deprecated` in `SQLImplicits` > -- > > Key: SPARK-47834 > URL: https://issues.apache.org/jira/browse/SPARK-47834 > Project: Spark > Issue Type: Sub-task > Components: Connect, SQL >Affects Versions: 4.0.0 >Reporter: Yang Jie >Assignee: Yang Jie >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Resolved] (SPARK-47834) Mark deprecated functions with `@deprecated` in `SQLImplicits`
[ https://issues.apache.org/jira/browse/SPARK-47834?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-47834. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46029 [https://github.com/apache/spark/pull/46029] > Mark deprecated functions with `@deprecated` in `SQLImplicits` > -- > > Key: SPARK-47834 > URL: https://issues.apache.org/jira/browse/SPARK-47834 > Project: Spark > Issue Type: Improvement > Components: Connect, SQL >Affects Versions: 4.0.0 >Reporter: Yang Jie >Assignee: Yang Jie >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Resolved] (SPARK-48227) Document the requirement of seed in protos
[ https://issues.apache.org/jira/browse/SPARK-48227?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48227. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46518 [https://github.com/apache/spark/pull/46518] > Document the requirement of seed in protos > -- > > Key: SPARK-48227 > URL: https://issues.apache.org/jira/browse/SPARK-48227 > Project: Spark > Issue Type: Improvement > Components: Documentation, PySpark >Affects Versions: 4.0.0 >Reporter: Ruifeng Zheng >Assignee: Ruifeng Zheng >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Assigned] (SPARK-48227) Document the requirement of seed in protos
[ https://issues.apache.org/jira/browse/SPARK-48227?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun reassigned SPARK-48227: - Assignee: Ruifeng Zheng > Document the requirement of seed in protos > -- > > Key: SPARK-48227 > URL: https://issues.apache.org/jira/browse/SPARK-48227 > Project: Spark > Issue Type: Improvement > Components: Documentation, PySpark >Affects Versions: 4.0.0 >Reporter: Ruifeng Zheng >Assignee: Ruifeng Zheng >Priority: Major > Labels: pull-request-available >
[jira] [Resolved] (SPARK-48226) Add `spark-ganglia-lgpl` to `lint-java` & `spark-ganglia-lgpl` and `jvm-profiler` to `sbt-checkstyle`
[ https://issues.apache.org/jira/browse/SPARK-48226?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48226. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46501 [https://github.com/apache/spark/pull/46501] > Add `spark-ganglia-lgpl` to `lint-java` & `spark-ganglia-lgpl` and > `jvm-profiler` to `sbt-checkstyle` > - > > Key: SPARK-48226 > URL: https://issues.apache.org/jira/browse/SPARK-48226 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: BingKun Pan >Assignee: BingKun Pan >Priority: Minor > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Assigned] (SPARK-48226) Add `spark-ganglia-lgpl` to `lint-java` & `spark-ganglia-lgpl` and `jvm-profiler` to `sbt-checkstyle`
[ https://issues.apache.org/jira/browse/SPARK-48226?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun reassigned SPARK-48226: - Assignee: BingKun Pan > Add `spark-ganglia-lgpl` to `lint-java` & `spark-ganglia-lgpl` and > `jvm-profiler` to `sbt-checkstyle` > - > > Key: SPARK-48226 > URL: https://issues.apache.org/jira/browse/SPARK-48226 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: BingKun Pan >Assignee: BingKun Pan >Priority: Minor > Labels: pull-request-available >
[jira] [Created] (SPARK-48225) Upgrade `sbt` to 1.10.0
Dongjoon Hyun created SPARK-48225: - Summary: Upgrade `sbt` to 1.10.0 Key: SPARK-48225 URL: https://issues.apache.org/jira/browse/SPARK-48225 Project: Spark Issue Type: Improvement Components: Build Affects Versions: 4.0.0 Reporter: Dongjoon Hyun
[jira] [Closed] (SPARK-48164) Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - get_resources_command`
[ https://issues.apache.org/jira/browse/SPARK-48164?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun closed SPARK-48164. - > Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - > get_resources_command` > -- > > Key: SPARK-48164 > URL: https://issues.apache.org/jira/browse/SPARK-48164 > Project: Spark > Issue Type: Bug > Components: Connect, Tests >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Priority: Major >
[jira] [Updated] (SPARK-48164) Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - get_resources_command`
[ https://issues.apache.org/jira/browse/SPARK-48164?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-48164: -- Parent: (was: SPARK-44111) Issue Type: Bug (was: Sub-task) > Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - > get_resources_command` > -- > > Key: SPARK-48164 > URL: https://issues.apache.org/jira/browse/SPARK-48164 > Project: Spark > Issue Type: Bug > Components: Connect, Tests >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Priority: Major >
[jira] [Updated] (SPARK-48164) Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - get_resources_command`
[ https://issues.apache.org/jira/browse/SPARK-48164?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-48164: -- Priority: Major (was: Blocker) > Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - > get_resources_command` > -- > > Key: SPARK-48164 > URL: https://issues.apache.org/jira/browse/SPARK-48164 > Project: Spark > Issue Type: Sub-task > Components: Connect, Tests >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Priority: Major >
[jira] [Updated] (SPARK-47930) Upgrade RoaringBitmap to 1.0.6
[ https://issues.apache.org/jira/browse/SPARK-47930?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-47930: -- Parent: SPARK-47046 Issue Type: Sub-task (was: Improvement) > Upgrade RoaringBitmap to 1.0.6 > -- > > Key: SPARK-47930 > URL: https://issues.apache.org/jira/browse/SPARK-47930 > Project: Spark > Issue Type: Sub-task > Components: Build >Affects Versions: 4.0.0 >Reporter: BingKun Pan >Assignee: BingKun Pan >Priority: Minor > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Updated] (SPARK-47982) Update code style plugins to latest version
[ https://issues.apache.org/jira/browse/SPARK-47982?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-47982: -- Parent: SPARK-47046 Issue Type: Sub-task (was: Improvement) > Update code style plugins to latest version > > > Key: SPARK-47982 > URL: https://issues.apache.org/jira/browse/SPARK-47982 > Project: Spark > Issue Type: Sub-task > Components: Build >Affects Versions: 4.0.0 >Reporter: BingKun Pan >Assignee: BingKun Pan >Priority: Minor > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Resolved] (SPARK-48216) Remove overrides DockerJDBCIntegrationSuite.connectionTimeout to make related tests configurable
[ https://issues.apache.org/jira/browse/SPARK-48216?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48216. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46505 [https://github.com/apache/spark/pull/46505] > Remove overrides DockerJDBCIntegrationSuite.connectionTimeout to make related > tests configurable > > > Key: SPARK-48216 > URL: https://issues.apache.org/jira/browse/SPARK-48216 > Project: Spark > Issue Type: Sub-task > Components: Spark Docker, Tests >Affects Versions: 4.0.0 >Reporter: Kent Yao >Assignee: Kent Yao >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Assigned] (SPARK-48216) Remove overrides DockerJDBCIntegrationSuite.connectionTimeout to make related tests configurable
[ https://issues.apache.org/jira/browse/SPARK-48216?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun reassigned SPARK-48216: - Assignee: Kent Yao > Remove overrides DockerJDBCIntegrationSuite.connectionTimeout to make related > tests configurable > > > Key: SPARK-48216 > URL: https://issues.apache.org/jira/browse/SPARK-48216 > Project: Spark > Issue Type: Sub-task > Components: Spark Docker, Tests >Affects Versions: 4.0.0 >Reporter: Kent Yao >Assignee: Kent Yao >Priority: Major > Labels: pull-request-available >
[jira] [Updated] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance
[ https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-48094: -- Description: h2. ASF INFRA POLICY - [https://infra.apache.org/github-actions-policy.html] h2. MONITORING - [https://infra-reports.apache.org/#ghactions=spark=168] !Screenshot 2024-05-02 at 23.56.05.png|width=100! h2. TARGET * All workflows MUST have a job concurrency level less than or equal to 20. This means a workflow cannot have more than 20 jobs running at the same time across all matrices. * All workflows SHOULD have a job concurrency level less than or equal to 15. Just because 20 is the max, doesn't mean you should strive for 20. * The average number of minutes a project uses per calendar week MUST NOT exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 hours). * The average number of minutes a project uses in any consecutive five-day period MUST NOT exceed the equivalent of 30 full-time runners (216,000 minutes, or 3,600 hours). h2. DEADLINE {quote}17th of May, 2024 {quote} was: h2. ASF INFRA POLICY - https://infra.apache.org/github-actions-policy.html h2. MONITORING - https://infra-reports.apache.org/#ghactions=spark=168 !Screenshot 2024-05-02 at 23.56.05.png|width=100%! h2. TARGET * All workflows MUST have a job concurrency level less than or equal to 20. This means a workflow cannot have more than 20 jobs running at the same time across all matrices. * All workflows SHOULD have a job concurrency level less than or equal to 15. Just because 20 is the max, doesn't mean you should strive for 20. * The average number of minutes a project uses per calendar week MUST NOT exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 hours). * The average number of minutes a project uses in any consecutive five-day period MUST NOT exceed the equivalent of 30 full-time runners (216,000 minutes, or 3,600 hours). h2. DEADLINE bq. 
17th of May, 2024 Since the deadline is 17th of May, 2024, I set this as the highest priority, `Blocker`. > Reduce GitHub Action usage according to ASF project allowance > - > > Key: SPARK-48094 > URL: https://issues.apache.org/jira/browse/SPARK-48094 > Project: Spark > Issue Type: Umbrella > Components: Project Infra >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Priority: Major > Attachments: Screenshot 2024-05-02 at 23.56.05.png > > > h2. ASF INFRA POLICY > - [https://infra.apache.org/github-actions-policy.html] > h2. MONITORING > - [https://infra-reports.apache.org/#ghactions=spark=168] > !Screenshot 2024-05-02 at 23.56.05.png|width=100! > h2. TARGET > * All workflows MUST have a job concurrency level less than or equal to 20. > This means a workflow cannot have more than 20 jobs running at the same time > across all matrices. > * All workflows SHOULD have a job concurrency level less than or equal to > 15. Just because 20 is the max, doesn't mean you should strive for 20. > * The average number of minutes a project uses per calendar week MUST NOT > exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 > hours). > * The average number of minutes a project uses in any consecutive five-day > period MUST NOT exceed the equivalent of 30 full-time runners (216,000 > minutes, or 3,600 hours). > h2. DEADLINE > {quote}17th of May, 2024 > {quote}
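The runner-allowance figures quoted in the policy text can be sanity-checked with simple arithmetic (a sketch added for illustration; the runner counts and window lengths come from the policy quote above, the computation is not part of the original notification):

```python
# Sanity-check of the ASF GitHub Actions allowance figures.
MINUTES_PER_DAY = 24 * 60  # 1440 runner-minutes per full-time runner per day

# Rolling five-day window: 30 full-time runners.
five_day_minutes = 30 * 5 * MINUTES_PER_DAY
print(five_day_minutes)        # 216000 minutes, as in the policy
print(five_day_minutes // 60)  # 3600 hours, as in the policy

# Calendar week: 25 full-time runners. This computes to 252,000 minutes
# (4,200 hours); the policy text rounds the minute figure to 250,000.
weekly_minutes = 25 * 7 * MINUTES_PER_DAY
print(weekly_minutes)          # 252000 minutes
print(weekly_minutes // 60)    # 4200 hours, as in the policy
```

The five-day figures match the policy exactly; the weekly minute count shows where the "250,000 minutes" in the policy text is a rounded value of 25 runners running around the clock for seven days.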
[jira] [Updated] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance
[ https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-48094: -- Priority: Major (was: Blocker) > Reduce GitHub Action usage according to ASF project allowance > - > > Key: SPARK-48094 > URL: https://issues.apache.org/jira/browse/SPARK-48094 > Project: Spark > Issue Type: Umbrella > Components: Project Infra >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Priority: Major > Attachments: Screenshot 2024-05-02 at 23.56.05.png > > > h2. ASF INFRA POLICY > - https://infra.apache.org/github-actions-policy.html > h2. MONITORING > - https://infra-reports.apache.org/#ghactions=spark=168 > !Screenshot 2024-05-02 at 23.56.05.png|width=100%! > h2. TARGET > * All workflows MUST have a job concurrency level less than or equal to 20. > This means a workflow cannot have more than 20 jobs running at the same time > across all matrices. > * All workflows SHOULD have a job concurrency level less than or equal to 15. > Just because 20 is the max, doesn't mean you should strive for 20. > * The average number of minutes a project uses per calendar week MUST NOT > exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 > hours). > * The average number of minutes a project uses in any consecutive five-day > period MUST NOT exceed the equivalent of 30 full-time runners (216,000 > minutes, or 3,600 hours). > h2. DEADLINE > bq. 17th of May, 2024 > Since the deadline is 17th of May, 2024, I set this as the highest priority, > `Blocker`.
[jira] [Updated] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance
[ https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-48094: -- Fix Version/s: (was: 4.0.0) > Reduce GitHub Action usage according to ASF project allowance > - > > Key: SPARK-48094 > URL: https://issues.apache.org/jira/browse/SPARK-48094 > Project: Spark > Issue Type: Umbrella > Components: Project Infra >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Priority: Blocker > Attachments: Screenshot 2024-05-02 at 23.56.05.png > > > h2. ASF INFRA POLICY > - https://infra.apache.org/github-actions-policy.html > h2. MONITORING > - https://infra-reports.apache.org/#ghactions=spark=168 > !Screenshot 2024-05-02 at 23.56.05.png|width=100%! > h2. TARGET > * All workflows MUST have a job concurrency level less than or equal to 20. > This means a workflow cannot have more than 20 jobs running at the same time > across all matrices. > * All workflows SHOULD have a job concurrency level less than or equal to 15. > Just because 20 is the max, doesn't mean you should strive for 20. > * The average number of minutes a project uses per calendar week MUST NOT > exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 > hours). > * The average number of minutes a project uses in any consecutive five-day > period MUST NOT exceed the equivalent of 30 full-time runners (216,000 > minutes, or 3,600 hours). > h2. DEADLINE > bq. 17th of May, 2024 > Since the deadline is 17th of May, 2024, I set this as the highest priority, > `Blocker`.
[jira] [Updated] (SPARK-48187) Run `docs` only in PR builders and `build_non_ansi` Daily CI
[ https://issues.apache.org/jira/browse/SPARK-48187?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-48187: -- Fix Version/s: 4.0.0 > Run `docs` only in PR builders and `build_non_ansi` Daily CI > > > Key: SPARK-48187 > URL: https://issues.apache.org/jira/browse/SPARK-48187 > Project: Spark > Issue Type: Sub-task > Components: Project Infra >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Updated] (SPARK-48187) Run `docs` only in PR builders and `build_non_ansi` Daily CI
[ https://issues.apache.org/jira/browse/SPARK-48187?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun updated SPARK-48187: -- Fix Version/s: (was: 4.0.0) > Run `docs` only in PR builders and `build_non_ansi` Daily CI > > > Key: SPARK-48187 > URL: https://issues.apache.org/jira/browse/SPARK-48187 > Project: Spark > Issue Type: Sub-task > Components: Project Infra >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Major > Labels: pull-request-available >
[jira] [Reopened] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance
[ https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun reopened SPARK-48094: --- Assignee: (was: Dongjoon Hyun) > Reduce GitHub Action usage according to ASF project allowance > - > > Key: SPARK-48094 > URL: https://issues.apache.org/jira/browse/SPARK-48094 > Project: Spark > Issue Type: Umbrella > Components: Project Infra >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Priority: Blocker > Fix For: 4.0.0 > > Attachments: Screenshot 2024-05-02 at 23.56.05.png > > > h2. ASF INFRA POLICY > - https://infra.apache.org/github-actions-policy.html > h2. MONITORING > - https://infra-reports.apache.org/#ghactions=spark=168 > !Screenshot 2024-05-02 at 23.56.05.png|width=100%! > h2. TARGET > * All workflows MUST have a job concurrency level less than or equal to 20. > This means a workflow cannot have more than 20 jobs running at the same time > across all matrices. > * All workflows SHOULD have a job concurrency level less than or equal to 15. > Just because 20 is the max, doesn't mean you should strive for 20. > * The average number of minutes a project uses per calendar week MUST NOT > exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 > hours). > * The average number of minutes a project uses in any consecutive five-day > period MUST NOT exceed the equivalent of 30 full-time runners (216,000 > minutes, or 3,600 hours). > h2. DEADLINE > bq. 17th of May, 2024 > Since the deadline is 17th of May, 2024, I set this as the highest priority, > `Blocker`.
[jira] [Assigned] (SPARK-48204) fix release script for Spark 4.0+
[ https://issues.apache.org/jira/browse/SPARK-48204?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun reassigned SPARK-48204: - Assignee: Wenchen Fan > fix release script for Spark 4.0+ > - > > Key: SPARK-48204 > URL: https://issues.apache.org/jira/browse/SPARK-48204 > Project: Spark > Issue Type: Bug > Components: Project Infra >Affects Versions: 4.0.0 >Reporter: Wenchen Fan >Assignee: Wenchen Fan >Priority: Major > Labels: pull-request-available >
[jira] [Resolved] (SPARK-48204) fix release script for Spark 4.0+
[ https://issues.apache.org/jira/browse/SPARK-48204?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48204. --- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46484 [https://github.com/apache/spark/pull/46484] > fix release script for Spark 4.0+ > - > > Key: SPARK-48204 > URL: https://issues.apache.org/jira/browse/SPARK-48204 > Project: Spark > Issue Type: Bug > Components: Project Infra >Affects Versions: 4.0.0 >Reporter: Wenchen Fan >Assignee: Wenchen Fan >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > >
[jira] [Resolved] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance
[ https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48094. --- Assignee: Dongjoon Hyun Resolution: Fixed > Reduce GitHub Action usage according to ASF project allowance > - > > Key: SPARK-48094 > URL: https://issues.apache.org/jira/browse/SPARK-48094 > Project: Spark > Issue Type: Umbrella > Components: Project Infra >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Blocker > Fix For: 4.0.0 > > Attachments: Screenshot 2024-05-02 at 23.56.05.png > > > h2. ASF INFRA POLICY > - https://infra.apache.org/github-actions-policy.html > h2. MONITORING > - https://infra-reports.apache.org/#ghactions=spark=168 > !Screenshot 2024-05-02 at 23.56.05.png|width=100%! > h2. TARGET > * All workflows MUST have a job concurrency level less than or equal to 20. > This means a workflow cannot have more than 20 jobs running at the same time > across all matrices. > * All workflows SHOULD have a job concurrency level less than or equal to 15. > Just because 20 is the max, doesn't mean you should strive for 20. > * The average number of minutes a project uses per calendar week MUST NOT > exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 > hours). > * The average number of minutes a project uses in any consecutive five-day > period MUST NOT exceed the equivalent of 30 full-time runners (216,000 > minutes, or 3,600 hours). > h2. DEADLINE > bq. 17th of May, 2024 > Since the deadline is 17th of May, 2024, I set this as the highest priority, > `Blocker`.
[jira] [Resolved] (SPARK-48207) Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if needed
[ https://issues.apache.org/jira/browse/SPARK-48207?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun resolved SPARK-48207. --- Fix Version/s: 3.4.4 Resolution: Fixed Issue resolved by pull request 46489 [https://github.com/apache/spark/pull/46489] > Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if needed > > > Key: SPARK-48207 > URL: https://issues.apache.org/jira/browse/SPARK-48207 > Project: Spark > Issue Type: Sub-task > Components: Project Infra >Affects Versions: 3.4.4 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Major > Labels: pull-request-available > Fix For: 3.4.4 > >
[jira] [Assigned] (SPARK-48207) Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if needed
[ https://issues.apache.org/jira/browse/SPARK-48207?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dongjoon Hyun reassigned SPARK-48207: - Assignee: Dongjoon Hyun > Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if needed > > > Key: SPARK-48207 > URL: https://issues.apache.org/jira/browse/SPARK-48207 > Project: Spark > Issue Type: Sub-task > Components: Project Infra >Affects Versions: 3.4.4 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Major > Labels: pull-request-available >