[jira] [Resolved] (SPARK-48381) Update `YuniKorn` docs with v1.5.1

2024-05-21 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48381?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48381.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46690
[https://github.com/apache/spark/pull/46690]

> Update `YuniKorn` docs with v1.5.1
> --
>
> Key: SPARK-48381
> URL: https://issues.apache.org/jira/browse/SPARK-48381
> Project: Spark
>  Issue Type: Sub-task
>  Components: Documentation, Kubernetes
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-48381) Update `YuniKorn` docs with v1.5.1

2024-05-21 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48381?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48381:
-

Assignee: Dongjoon Hyun

> Update `YuniKorn` docs with v1.5.1
> --
>
> Key: SPARK-48381
> URL: https://issues.apache.org/jira/browse/SPARK-48381
> Project: Spark
>  Issue Type: Sub-task
>  Components: Documentation, Kubernetes
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Minor
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-48381) Update `YuniKorn` docs with v1.5.1

2024-05-21 Thread Dongjoon Hyun (Jira)
Dongjoon Hyun created SPARK-48381:
-

 Summary: Update `YuniKorn` docs with v1.5.1
 Key: SPARK-48381
 URL: https://issues.apache.org/jira/browse/SPARK-48381
 Project: Spark
  Issue Type: Sub-task
  Components: Documentation, Kubernetes
Affects Versions: 4.0.0
Reporter: Dongjoon Hyun









[jira] [Updated] (SPARK-48329) Enable `spark.sql.sources.v2.bucketing.pushPartValues.enabled` by default

2024-05-21 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48329?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48329:
--
Summary: Enable `spark.sql.sources.v2.bucketing.pushPartValues.enabled` by 
default  (was: Default spark.sql.sources.v2.bucketing.pushPartValues.enabled to 
true)

> Enable `spark.sql.sources.v2.bucketing.pushPartValues.enabled` by default
> -
>
> Key: SPARK-48329
> URL: https://issues.apache.org/jira/browse/SPARK-48329
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Szehon Ho
>Assignee: Szehon Ho
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> The SPJ feature flag 'spark.sql.sources.v2.bucketing.pushPartValues.enabled'
> has proven valuable for most use cases. We should take advantage of the 4.0
> release and change the default value to true.
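As a rough illustration of what the default flip means for users (plain Python with no Spark dependency; the key name is real, but the lookup logic below is only a toy model, not Spark's SQLConf implementation): an explicitly set value still overrides the new shipped default.

```python
# Toy model of a boolean SQL conf with a shipped default.
# After SPARK-48329, the shipped default for this key becomes "true".
DEFAULTS = {
    "spark.sql.sources.v2.bucketing.pushPartValues.enabled": "true",
}

def get_conf(user_confs, key):
    # Explicit user settings win; otherwise fall back to the shipped default.
    return user_confs.get(key, DEFAULTS[key])

key = "spark.sql.sources.v2.bucketing.pushPartValues.enabled"
assert get_conf({}, key) == "true"               # new default applies
assert get_conf({key: "false"}, key) == "false"  # explicit opt-out still wins
```

Users who depended on the old behavior can keep it by setting the flag to false explicitly (for example via `--conf` at submit time).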






[jira] [Updated] (SPARK-48329) Default spark.sql.sources.v2.bucketing.pushPartValues.enabled to true

2024-05-21 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48329?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48329:
--
Parent Issue: SPARK-44111  (was: SPARK-37375)

> Default spark.sql.sources.v2.bucketing.pushPartValues.enabled to true
> -
>
> Key: SPARK-48329
> URL: https://issues.apache.org/jira/browse/SPARK-48329
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Szehon Ho
>Assignee: Szehon Ho
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> The SPJ feature flag 'spark.sql.sources.v2.bucketing.pushPartValues.enabled'
> has proven valuable for most use cases. We should take advantage of the 4.0
> release and change the default value to true.






[jira] [Resolved] (SPARK-48329) Default spark.sql.sources.v2.bucketing.pushPartValues.enabled to true

2024-05-21 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48329?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48329.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46673
[https://github.com/apache/spark/pull/46673]

> Default spark.sql.sources.v2.bucketing.pushPartValues.enabled to true
> -
>
> Key: SPARK-48329
> URL: https://issues.apache.org/jira/browse/SPARK-48329
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Szehon Ho
>Assignee: Szehon Ho
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> The SPJ feature flag 'spark.sql.sources.v2.bucketing.pushPartValues.enabled'
> has proven valuable for most use cases. We should take advantage of the 4.0
> release and change the default value to true.






[jira] [Assigned] (SPARK-48329) Default spark.sql.sources.v2.bucketing.pushPartValues.enabled to true

2024-05-21 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48329?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48329:
-

Assignee: Szehon Ho

> Default spark.sql.sources.v2.bucketing.pushPartValues.enabled to true
> -
>
> Key: SPARK-48329
> URL: https://issues.apache.org/jira/browse/SPARK-48329
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Szehon Ho
>Assignee: Szehon Ho
>Priority: Minor
>  Labels: pull-request-available
>
> The SPJ feature flag 'spark.sql.sources.v2.bucketing.pushPartValues.enabled'
> has proven valuable for most use cases. We should take advantage of the 4.0
> release and change the default value to true.






[jira] [Resolved] (SPARK-48328) Upgrade `Arrow` to 16.1.0

2024-05-20 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48328?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48328.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46646
[https://github.com/apache/spark/pull/46646]

> Upgrade `Arrow` to 16.1.0
> -
>
> Key: SPARK-48328
> URL: https://issues.apache.org/jira/browse/SPARK-48328
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48017) Add Spark application submission worker for operator

2024-05-20 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48017?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48017:
-

Assignee: Zhou JIANG

> Add Spark application submission worker for operator
> 
>
> Key: SPARK-48017
> URL: https://issues.apache.org/jira/browse/SPARK-48017
> Project: Spark
>  Issue Type: Sub-task
>  Components: k8s
>Affects Versions: kubernetes-operator-0.1.0
>Reporter: Zhou JIANG
>Assignee: Zhou JIANG
>Priority: Major
>  Labels: pull-request-available
>
> Spark Operator needs a submission worker that converts its application
> abstraction (Operator API) to k8s resources.






[jira] [Resolved] (SPARK-48017) Add Spark application submission worker for operator

2024-05-20 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48017?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48017.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 10
[https://github.com/apache/spark-kubernetes-operator/pull/10]

> Add Spark application submission worker for operator
> 
>
> Key: SPARK-48017
> URL: https://issues.apache.org/jira/browse/SPARK-48017
> Project: Spark
>  Issue Type: Sub-task
>  Components: k8s
>Affects Versions: kubernetes-operator-0.1.0
>Reporter: Zhou JIANG
>Assignee: Zhou JIANG
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Spark Operator needs a submission worker that converts its application
> abstraction (Operator API) to k8s resources.






[jira] [Resolved] (SPARK-48256) Add a rule to check file headers for the java side, and fix inconsistent files

2024-05-15 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48256?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48256.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46557
[https://github.com/apache/spark/pull/46557]

> Add a rule to check file headers for the java side, and fix inconsistent files
> --
>
> Key: SPARK-48256
> URL: https://issues.apache.org/jira/browse/SPARK-48256
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48256) Add a rule to check file headers for the java side, and fix inconsistent files

2024-05-15 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48256?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48256:
-

Assignee: BingKun Pan

> Add a rule to check file headers for the java side, and fix inconsistent files
> --
>
> Key: SPARK-48256
> URL: https://issues.apache.org/jira/browse/SPARK-48256
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
>







[jira] [Assigned] (SPARK-48218) TransportClientFactory.createClient may NPE cause FetchFailedException

2024-05-15 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48218?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48218:
-

Assignee: dzcxzl

> TransportClientFactory.createClient may NPE cause FetchFailedException
> --
>
> Key: SPARK-48218
> URL: https://issues.apache.org/jira/browse/SPARK-48218
> Project: Spark
>  Issue Type: Improvement
>  Components: Shuffle
>Affects Versions: 4.0.0
>Reporter: dzcxzl
>Assignee: dzcxzl
>Priority: Minor
>  Labels: pull-request-available
>
> {code:java}
> org.apache.spark.shuffle.FetchFailedException
>   at 
> org.apache.spark.storage.ShuffleBlockFetcherIterator.throwFetchFailedException(ShuffleBlockFetcherIterator.scala:1180)
>   at 
> org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:913)
>   at 
> org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:84)
>   at 
> org.apache.spark.util.CompletionIterator.next(CompletionIterator.scala:29)
> Caused by: java.lang.NullPointerException
>   at 
> org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:178)
>   at 
> org.apache.spark.network.shuffle.ExternalBlockStoreClient.lambda$fetchBlocks$0(ExternalBlockStoreClient.java:128)
>   at 
> org.apache.spark.network.shuffle.RetryingBlockTransferor.transferAllOutstanding(RetryingBlockTransferor.java:154)
>   at 
> org.apache.spark.network.shuffle.RetryingBlockTransferor.start(RetryingBlockTransferor.java:133)
>   at 
> org.apache.spark.network.shuffle.ExternalBlockStoreClient.fetchBlocks(ExternalBlockStoreClient.java:139)
> {code}
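The trace above shows createClient dereferencing a pooled client slot that turned out to be null. This is not the actual Spark patch, just a minimal Python sketch of the guard pattern such a bug calls for: check the cached slot and create a fresh client instead of handing back an empty entry.

```python
class ClientPool:
    """Toy connection pool: slots start empty (None) and are filled lazily."""

    def __init__(self, size):
        self._slots = [None] * size

    def get_or_create(self, idx, factory):
        client = self._slots[idx]
        if client is None:           # guard: never return an empty slot
            client = factory(idx)    # create a fresh client instead
            self._slots[idx] = client
        return client

pool = ClientPool(2)
first = pool.get_or_create(0, lambda i: f"client-{i}")
second = pool.get_or_create(0, lambda i: f"client-{i}")
assert first == "client-0"
assert second is first  # second call reuses the cached client
```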






[jira] [Resolved] (SPARK-48218) TransportClientFactory.createClient may NPE cause FetchFailedException

2024-05-15 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48218?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48218.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46506
[https://github.com/apache/spark/pull/46506]

> TransportClientFactory.createClient may NPE cause FetchFailedException
> --
>
> Key: SPARK-48218
> URL: https://issues.apache.org/jira/browse/SPARK-48218
> Project: Spark
>  Issue Type: Improvement
>  Components: Shuffle
>Affects Versions: 4.0.0
>Reporter: dzcxzl
>Assignee: dzcxzl
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> {code:java}
> org.apache.spark.shuffle.FetchFailedException
>   at 
> org.apache.spark.storage.ShuffleBlockFetcherIterator.throwFetchFailedException(ShuffleBlockFetcherIterator.scala:1180)
>   at 
> org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:913)
>   at 
> org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:84)
>   at 
> org.apache.spark.util.CompletionIterator.next(CompletionIterator.scala:29)
> Caused by: java.lang.NullPointerException
>   at 
> org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:178)
>   at 
> org.apache.spark.network.shuffle.ExternalBlockStoreClient.lambda$fetchBlocks$0(ExternalBlockStoreClient.java:128)
>   at 
> org.apache.spark.network.shuffle.RetryingBlockTransferor.transferAllOutstanding(RetryingBlockTransferor.java:154)
>   at 
> org.apache.spark.network.shuffle.RetryingBlockTransferor.start(RetryingBlockTransferor.java:133)
>   at 
> org.apache.spark.network.shuffle.ExternalBlockStoreClient.fetchBlocks(ExternalBlockStoreClient.java:139)
> {code}






[jira] [Resolved] (SPARK-48049) Upgrade Scala to 2.13.14

2024-05-15 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48049?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48049.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46288
[https://github.com/apache/spark/pull/46288]

> Upgrade Scala to 2.13.14
> 
>
> Key: SPARK-48049
> URL: https://issues.apache.org/jira/browse/SPARK-48049
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-48285) Update docs for size function and sizeOfNull configuration

2024-05-15 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48285?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48285.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46592
[https://github.com/apache/spark/pull/46592]

> Update docs for size function and sizeOfNull configuration
> --
>
> Key: SPARK-48285
> URL: https://issues.apache.org/jira/browse/SPARK-48285
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Kent Yao
>Assignee: Kent Yao
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Commented] (SPARK-48238) Spark fail to start due to class o.a.h.yarn.server.webproxy.amfilter.AmIpFilter is not a jakarta.servlet.Filter

2024-05-15 Thread Dongjoon Hyun (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-48238?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17846701#comment-17846701
 ] 

Dongjoon Hyun commented on SPARK-48238:
---

Hi, [~chengpan], [~HF], and [~cloud_fan]. Is it true that we need to revert
SPARK-45522 and SPARK-47118 solely for YARN support?
Do you think there is an alternative, as we did for Hadoop 2/Hadoop 3 support
and Hive 1/Hive 2 support?
For example, can we isolate the Jetty issues to the YARN module and JettyUtils
via configurations?

> Spark fail to start due to class 
> o.a.h.yarn.server.webproxy.amfilter.AmIpFilter is not a jakarta.servlet.Filter
> ---
>
> Key: SPARK-48238
> URL: https://issues.apache.org/jira/browse/SPARK-48238
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Priority: Blocker
>
> I tested the latest master branch; it failed to start in YARN mode
> {code:java}
> dev/make-distribution.sh --tgz -Phive,hive-thriftserver,yarn{code}
>  
> {code:java}
> $ bin/spark-sql --master yarn
> WARNING: Using incubator modules: jdk.incubator.vector
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
> setLogLevel(newLevel).
> 2024-05-10 17:58:17 WARN NativeCodeLoader: Unable to load native-hadoop 
> library for your platform... using builtin-java classes where applicable
> 2024-05-10 17:58:18 WARN Client: Neither spark.yarn.jars nor 
> spark.yarn.archive} is set, falling back to uploading libraries under 
> SPARK_HOME.
> 2024-05-10 17:58:25 ERROR SparkContext: Error initializing SparkContext.
> org.sparkproject.jetty.util.MultiException: Multiple exceptions
>     at 
> org.sparkproject.jetty.util.MultiException.ifExceptionThrow(MultiException.java:117)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.servlet.ServletHandler.initialize(ServletHandler.java:751)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:392)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:902)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:306)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:93)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at org.apache.spark.ui.ServerInfo.addHandler(JettyUtils.scala:514) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2(SparkUI.scala:81) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2$adapted(SparkUI.scala:81)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619) 
> ~[scala-library-2.13.13.jar:?]
>     at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617) 
> ~[scala-library-2.13.13.jar:?]
>     at scala.collection.AbstractIterable.foreach(Iterable.scala:935) 
> ~[scala-library-2.13.13.jar:?]
>     at 
> org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1(SparkUI.scala:81) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1$adapted(SparkUI.scala:79)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?]
>     at org.apache.spark.ui.SparkUI.attachAllHandlers(SparkUI.scala:79) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at org.apache.spark.SparkContext.$anonfun$new$31(SparkContext.scala:690) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.SparkContext.$anonfun$new$31$adapted(SparkContext.scala:690) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?]
>     at org.apache.spark.SparkContext.(SparkContext.scala:690) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2963) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:1118)
>  ~[spark-sql_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at scala.Option.getOrElse(Option.scala:201) [scala-library-2.13.13.jar:?]
>     at 
> 

[jira] [Updated] (SPARK-48238) Spark fail to start due to class o.a.h.yarn.server.webproxy.amfilter.AmIpFilter is not a jakarta.servlet.Filter

2024-05-15 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48238?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48238:
--
Description: 
I tested the latest master branch; it failed to start in YARN mode
{code:java}
dev/make-distribution.sh --tgz -Phive,hive-thriftserver,yarn{code}
 
{code:java}
$ bin/spark-sql --master yarn
WARNING: Using incubator modules: jdk.incubator.vector
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
setLogLevel(newLevel).
2024-05-10 17:58:17 WARN NativeCodeLoader: Unable to load native-hadoop library 
for your platform... using builtin-java classes where applicable
2024-05-10 17:58:18 WARN Client: Neither spark.yarn.jars nor 
spark.yarn.archive} is set, falling back to uploading libraries under 
SPARK_HOME.
2024-05-10 17:58:25 ERROR SparkContext: Error initializing SparkContext.
org.sparkproject.jetty.util.MultiException: Multiple exceptions
    at 
org.sparkproject.jetty.util.MultiException.ifExceptionThrow(MultiException.java:117)
 ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.sparkproject.jetty.servlet.ServletHandler.initialize(ServletHandler.java:751)
 ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.sparkproject.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:392)
 ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.sparkproject.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:902)
 ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.sparkproject.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:306)
 ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:93)
 ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at org.apache.spark.ui.ServerInfo.addHandler(JettyUtils.scala:514) 
~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2(SparkUI.scala:81) 
~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2$adapted(SparkUI.scala:81)
 ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619) 
~[scala-library-2.13.13.jar:?]
    at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617) 
~[scala-library-2.13.13.jar:?]
    at scala.collection.AbstractIterable.foreach(Iterable.scala:935) 
~[scala-library-2.13.13.jar:?]
    at 
org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1(SparkUI.scala:81) 
~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1$adapted(SparkUI.scala:79)
 ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?]
    at org.apache.spark.ui.SparkUI.attachAllHandlers(SparkUI.scala:79) 
~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at org.apache.spark.SparkContext.$anonfun$new$31(SparkContext.scala:690) 
~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.SparkContext.$anonfun$new$31$adapted(SparkContext.scala:690) 
~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?]
    at org.apache.spark.SparkContext.(SparkContext.scala:690) 
~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2963) 
~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:1118)
 ~[spark-sql_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at scala.Option.getOrElse(Option.scala:201) [scala-library-2.13.13.jar:?]
    at 
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:1112) 
[spark-sql_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:64) 
[spark-hive-thriftserver_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.(SparkSQLCLIDriver.scala:405)
 [spark-hive-thriftserver_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:162)
 [spark-hive-thriftserver_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
 [spark-hive-thriftserver_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native 
Method) ~[?:?]
    at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
 ~[?:?]
    at 

[jira] [Resolved] (SPARK-48279) Upgrade ORC to 2.0.1

2024-05-15 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48279?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48279.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46587
[https://github.com/apache/spark/pull/46587]

> Upgrade ORC to 2.0.1
> 
>
> Key: SPARK-48279
> URL: https://issues.apache.org/jira/browse/SPARK-48279
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: William Hyun
>Assignee: William Hyun
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Updated] (SPARK-48231) Remove unused CodeHaus Jackson dependencies

2024-05-13 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48231?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48231:
--
Parent: (was: SPARK-47046)
Issue Type: Bug  (was: Sub-task)

> Remove unused CodeHaus Jackson dependencies
> ---
>
> Key: SPARK-48231
> URL: https://issues.apache.org/jira/browse/SPARK-48231
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Updated] (SPARK-48230) Remove unused jodd-core

2024-05-13 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48230:
--
Parent: (was: SPARK-47046)
Issue Type: Bug  (was: Sub-task)

> Remove unused jodd-core
> ---
>
> Key: SPARK-48230
> URL: https://issues.apache.org/jira/browse/SPARK-48230
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Updated] (SPARK-48237) After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be deleted

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48237?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48237:
--
Issue Type: Bug  (was: Improvement)

> After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be 
> deleted 
> 
>
> Key: SPARK-48237
> URL: https://issues.apache.org/jira/browse/SPARK-48237
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2, 3.4.4
>
>







[jira] [Assigned] (SPARK-48237) After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be deleted

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48237?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48237:
-

Assignee: BingKun Pan

> After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be 
> deleted 
> 
>
> Key: SPARK-48237
> URL: https://issues.apache.org/jira/browse/SPARK-48237
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-48237) After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be deleted

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48237?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48237.
---
Fix Version/s: 3.4.4
   3.5.2
   4.0.0
   Resolution: Fixed

Issue resolved by pull request 46531
[https://github.com/apache/spark/pull/46531]

> After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be 
> deleted 
> 
>
> Key: SPARK-48237
> URL: https://issues.apache.org/jira/browse/SPARK-48237
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 3.4.4, 3.5.2, 4.0.0
>
>







[jira] [Resolved] (SPARK-48236) Add `commons-lang:commons-lang:2.6` back to support legacy Hive UDF jars

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48236?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48236.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46528
[https://github.com/apache/spark/pull/46528]

> Add `commons-lang:commons-lang:2.6` back to support legacy Hive UDF jars
> 
>
> Key: SPARK-48236
> URL: https://issues.apache.org/jira/browse/SPARK-48236
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Commented] (SPARK-48230) Remove unused jodd-core

2024-05-10 Thread Dongjoon Hyun (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17845522#comment-17845522
 ] 

Dongjoon Hyun commented on SPARK-48230:
---

We will revisit this dependency.

> Remove unused jodd-core
> ---
>
> Key: SPARK-48230
> URL: https://issues.apache.org/jira/browse/SPARK-48230
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Updated] (SPARK-48230) Remove unused jodd-core

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48230:
--
Fix Version/s: (was: 4.0.0)

> Remove unused jodd-core
> ---
>
> Key: SPARK-48230
> URL: https://issues.apache.org/jira/browse/SPARK-48230
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Updated] (ORC-1714) Bump commons-csv to 1.11.0

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/ORC-1714?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated ORC-1714:
---
Fix Version/s: 2.0.1

> Bump commons-csv to 1.11.0
> --
>
> Key: ORC-1714
> URL: https://issues.apache.org/jira/browse/ORC-1714
> Project: ORC
>  Issue Type: Bug
>  Components: Java
>Affects Versions: 2.1.0
>Reporter: William Hyun
>Assignee: William Hyun
>Priority: Minor
> Fix For: 2.1.0, 2.0.1
>
>






[jira] [Updated] (ORC-1661) [C++] Better handling when TZDB is unavailable

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/ORC-1661?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated ORC-1661:
---
Fix Version/s: (was: 2.0.1)

> [C++] Better handling when TZDB is unavailable
> --
>
> Key: ORC-1661
> URL: https://issues.apache.org/jira/browse/ORC-1661
> Project: ORC
>  Issue Type: Improvement
>  Components: C++
>Reporter: Gang Wu
>Assignee: Gang Wu
>Priority: Major
> Fix For: 2.1.0
>
>
> When /usr/share/zoneinfo is unavailable and TZDIR env is unset, creating C++ 
> ORC reader will crash on Windows. We need to better deal with this case. See 
> context from the Apache Arrow community: 
> [https://github.com/apache/arrow/issues/36026] and 
> [https://github.com/apache/arrow/issues/40633]
>  
> We could perhaps do following things:
>  * Make sure it does not crash when TZDB is missing (on Windows). The 
> prerequisite is to enable running C++ unit tests on Windows.
>  * More TZDB search locations. (e.g. $CONDA_PREFIX/usr/share/zoneinfo)
>  * Do not eagerly read TZDB when timestamp types are not required.
>  * Add a runtime config to set TZDB location.
>  * Ship TZDB with the ORC library while building on Windows?
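The search-location idea above can be sketched as a small lookup routine. This is an illustrative Python sketch only, not the ORC C++ reader's code; the candidate paths and the `TZDIR` precedence are assumptions based on the bullets above.

```python
import os

# Candidate TZDB locations, checked in order. Illustrative; the actual
# search paths used by the ORC C++ reader may differ.
DEFAULT_TZDB_DIRS = [
    "/usr/share/zoneinfo",
    os.path.join(os.environ.get("CONDA_PREFIX", ""), "share", "zoneinfo"),
]

def find_tzdb_dir(extra_dirs=None):
    """Return the first existing TZDB directory, or None instead of crashing.

    TZDIR (if set) takes precedence, mirroring the usual convention; a None
    result lets the caller defer the error until timestamps are needed.
    """
    candidates = []
    tzdir = os.environ.get("TZDIR")
    if tzdir:
        candidates.append(tzdir)
    candidates.extend(extra_dirs or [])
    candidates.extend(DEFAULT_TZDB_DIRS)
    for d in candidates:
        if d and os.path.isdir(d):
            return d
    return None
```

Returning None rather than raising is what makes "do not crash when TZDB is missing" and "do not eagerly read TZDB" compatible: the reader can be constructed first and fail only if a timestamp column is actually decoded.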





[jira] [Resolved] (SPARK-48144) canPlanAsBroadcastHashJoin should respect shuffle join hints

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48144?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48144.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46401
[https://github.com/apache/spark/pull/46401]

> canPlanAsBroadcastHashJoin should respect shuffle join hints
> 
>
> Key: SPARK-48144
> URL: https://issues.apache.org/jira/browse/SPARK-48144
> Project: Spark
>  Issue Type: Bug
>  Components: Optimizer
>Affects Versions: 4.0.0, 3.5.2, 3.4.4
>Reporter: Fredrik Klauß
>Assignee: Fredrik Klauß
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Currently, `canPlanAsBroadcastHashJoin` incorrectly returns that a join can 
> be planned as a BHJ, even though the join contains a SHJ.
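The intended precedence can be illustrated with a toy model of the planner decision. This is not Spark's actual JoinSelection code; only the hint names follow Spark SQL's join hints, and the size check is a stand-in.

```python
# Toy model of the BHJ-eligibility check, for illustration only.
BROADCAST_HINTS = {"BROADCAST", "BROADCASTJOIN", "MAPJOIN"}
SHUFFLE_HINTS = {"SHUFFLE_HASH", "MERGE", "SHUFFLEMERGE", "MERGEJOIN",
                 "SHUFFLE_REPLICATE_NL"}

def can_plan_as_broadcast_hash_join(hints, small_enough):
    """A join may be planned as BHJ only if no shuffle join hint forbids it."""
    hints = {h.upper() for h in hints}
    if hints & SHUFFLE_HINTS:
        return False          # the fix: shuffle hints win over the size check
    if hints & BROADCAST_HINTS:
        return True
    return small_enough       # otherwise fall back to the size heuristic
```

Before the fix, the size-based path could report True even when a shuffle hint was present, which is the mismatch the issue describes.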






[jira] [Assigned] (SPARK-48144) canPlanAsBroadcastHashJoin should respect shuffle join hints

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48144?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48144:
-

Assignee: Fredrik Klauß

> canPlanAsBroadcastHashJoin should respect shuffle join hints
> 
>
> Key: SPARK-48144
> URL: https://issues.apache.org/jira/browse/SPARK-48144
> Project: Spark
>  Issue Type: Bug
>  Components: Optimizer
>Affects Versions: 4.0.0, 3.5.2, 3.4.4
>Reporter: Fredrik Klauß
>Assignee: Fredrik Klauß
>Priority: Major
>  Labels: pull-request-available
>
> Currently, `canPlanAsBroadcastHashJoin` incorrectly returns that a join can 
> be planned as a BHJ, even though the join contains a SHJ.






[jira] [Assigned] (SPARK-47441) Do not add log link for unmanaged AM in Spark UI

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47441?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-47441:
-

Assignee: Yuming Wang

> Do not add log link for unmanaged AM in Spark UI
> 
>
> Key: SPARK-47441
> URL: https://issues.apache.org/jira/browse/SPARK-47441
> Project: Spark
>  Issue Type: Bug
>  Components: YARN
>Affects Versions: 3.5.0, 3.5.1
>Reporter: Yuming Wang
>Assignee: Yuming Wang
>Priority: Major
>  Labels: pull-request-available
>
> {noformat}
> 24/03/18 04:58:25,022 ERROR [spark-listener-group-appStatus] 
> scheduler.AsyncEventQueue:97 : Listener AppStatusListener threw an exception
> java.lang.NumberFormatException: For input string: "null"
>   at 
> java.lang.NumberFormatException.forInputString(NumberFormatException.java:67) 
> ~[?:?]
>   at java.lang.Integer.parseInt(Integer.java:668) ~[?:?]
>   at java.lang.Integer.parseInt(Integer.java:786) ~[?:?]
>   at scala.collection.immutable.StringLike.toInt(StringLike.scala:310) 
> ~[scala-library-2.12.18.jar:?]
>   at scala.collection.immutable.StringLike.toInt$(StringLike.scala:310) 
> ~[scala-library-2.12.18.jar:?]
>   at scala.collection.immutable.StringOps.toInt(StringOps.scala:33) 
> ~[scala-library-2.12.18.jar:?]
>   at org.apache.spark.util.Utils$.parseHostPort(Utils.scala:1105) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.ProcessSummaryWrapper.<init>(storeTypes.scala:609) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.LiveMiscellaneousProcess.doUpdate(LiveEntity.scala:1045)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at org.apache.spark.status.LiveEntity.write(LiveEntity.scala:50) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.AppStatusListener.update(AppStatusListener.scala:1233)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.AppStatusListener.onMiscellaneousProcessAdded(AppStatusListener.scala:1445)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.AppStatusListener.onOtherEvent(AppStatusListener.scala:113)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.SparkListenerBus.doPostEvent(SparkListenerBus.scala:100)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.SparkListenerBus.doPostEvent$(SparkListenerBus.scala:28)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:117) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:101) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:105)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:105)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23) 
> ~[scala-library-2.12.18.jar:?]
>   at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) 
> ~[scala-library-2.12.18.jar:?]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1356) 
> [spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96)
>  [spark-core_2.12-3.5.1.jar:3.5.1]
> {noformat}
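The failure mode in the trace above is `Integer.parseInt("null")` blowing up when an unmanaged AM has no real log link. A defensive host:port split can be sketched in Python; this mirrors the failure only and is not Spark's Scala `Utils.parseHostPort`.

```python
def parse_host_port(hostport):
    """Split "host:port", tolerating a non-numeric port.

    A literal port string like "null" yields (host, None) instead of
    raising, unlike the bare int-parse that produced the exception above.
    """
    host, sep, port = hostport.rpartition(":")
    if not sep:
        return hostport, None          # no port component at all
    try:
        return host, int(port)
    except ValueError:                 # e.g. int("null") raises ValueError
        return host, None
```

The fix Spark took is one level higher (skip adding the log link for an unmanaged AM), but the sketch shows why the listener crashed rather than degrading gracefully.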






[jira] [Resolved] (SPARK-47441) Do not add log link for unmanaged AM in Spark UI

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47441?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-47441.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45565
[https://github.com/apache/spark/pull/45565]

> Do not add log link for unmanaged AM in Spark UI
> 
>
> Key: SPARK-47441
> URL: https://issues.apache.org/jira/browse/SPARK-47441
> Project: Spark
>  Issue Type: Bug
>  Components: YARN
>Affects Versions: 3.5.0, 3.5.1
>Reporter: Yuming Wang
>Assignee: Yuming Wang
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> {noformat}
> 24/03/18 04:58:25,022 ERROR [spark-listener-group-appStatus] 
> scheduler.AsyncEventQueue:97 : Listener AppStatusListener threw an exception
> java.lang.NumberFormatException: For input string: "null"
>   at 
> java.lang.NumberFormatException.forInputString(NumberFormatException.java:67) 
> ~[?:?]
>   at java.lang.Integer.parseInt(Integer.java:668) ~[?:?]
>   at java.lang.Integer.parseInt(Integer.java:786) ~[?:?]
>   at scala.collection.immutable.StringLike.toInt(StringLike.scala:310) 
> ~[scala-library-2.12.18.jar:?]
>   at scala.collection.immutable.StringLike.toInt$(StringLike.scala:310) 
> ~[scala-library-2.12.18.jar:?]
>   at scala.collection.immutable.StringOps.toInt(StringOps.scala:33) 
> ~[scala-library-2.12.18.jar:?]
>   at org.apache.spark.util.Utils$.parseHostPort(Utils.scala:1105) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.ProcessSummaryWrapper.<init>(storeTypes.scala:609) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.LiveMiscellaneousProcess.doUpdate(LiveEntity.scala:1045)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at org.apache.spark.status.LiveEntity.write(LiveEntity.scala:50) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.AppStatusListener.update(AppStatusListener.scala:1233)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.AppStatusListener.onMiscellaneousProcessAdded(AppStatusListener.scala:1445)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.AppStatusListener.onOtherEvent(AppStatusListener.scala:113)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.SparkListenerBus.doPostEvent(SparkListenerBus.scala:100)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.SparkListenerBus.doPostEvent$(SparkListenerBus.scala:28)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:117) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:101) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:105)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:105)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23) 
> ~[scala-library-2.12.18.jar:?]
>   at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) 
> ~[scala-library-2.12.18.jar:?]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1356) 
> [spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96)
>  [spark-core_2.12-3.5.1.jar:3.5.1]
> {noformat}






[jira] [Assigned] (SPARK-48235) Directly pass join instead of all arguments to getBroadcastBuildSide and getShuffleHashJoinBuildSide

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48235?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48235:
-

Assignee: Fredrik Klauß  (was: Dongjoon Hyun)

> Directly pass join instead of all arguments to getBroadcastBuildSide and 
> getShuffleHashJoinBuildSide
> 
>
> Key: SPARK-48235
> URL: https://issues.apache.org/jira/browse/SPARK-48235
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Fredrik Klauß
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Updated] (SPARK-48235) Directly pass join instead of all arguments to getBroadcastBuildSide and getShuffleHashJoinBuildSide

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48235?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48235:
--
Reporter: Fredrik Klauß  (was: Dongjoon Hyun)

> Directly pass join instead of all arguments to getBroadcastBuildSide and 
> getShuffleHashJoinBuildSide
> 
>
> Key: SPARK-48235
> URL: https://issues.apache.org/jira/browse/SPARK-48235
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Fredrik Klauß
>Assignee: Fredrik Klauß
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-48235) Directly pass join instead of all arguments to getBroadcastBuildSide and getShuffleHashJoinBuildSide

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48235?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48235.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46525
[https://github.com/apache/spark/pull/46525]

> Directly pass join instead of all arguments to getBroadcastBuildSide and 
> getShuffleHashJoinBuildSide
> 
>
> Key: SPARK-48235
> URL: https://issues.apache.org/jira/browse/SPARK-48235
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48235) Directly pass join instead of all arguments to getBroadcastBuildSide and getShuffleHashJoinBuildSide

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48235?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48235:
-

Assignee: Dongjoon Hyun

> Directly pass join instead of all arguments to getBroadcastBuildSide and 
> getShuffleHashJoinBuildSide
> 
>
> Key: SPARK-48235
> URL: https://issues.apache.org/jira/browse/SPARK-48235
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-48235) Directly pass join instead of all arguments to getBroadcastBuildSide and getShuffleHashJoinBuildSide

2024-05-10 Thread Dongjoon Hyun (Jira)
Dongjoon Hyun created SPARK-48235:
-

 Summary: Directly pass join instead of all arguments to 
getBroadcastBuildSide and getShuffleHashJoinBuildSide
 Key: SPARK-48235
 URL: https://issues.apache.org/jira/browse/SPARK-48235
 Project: Spark
  Issue Type: Improvement
  Components: SQL
Affects Versions: 4.0.0
Reporter: Dongjoon Hyun









[jira] [Resolved] (SPARK-48230) Remove unused jodd-core

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48230.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46520
[https://github.com/apache/spark/pull/46520]

> Remove unused jodd-core
> ---
>
> Key: SPARK-48230
> URL: https://issues.apache.org/jira/browse/SPARK-48230
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48231) Remove unused CodeHaus Jackson dependencies

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48231?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48231:
-

Assignee: Cheng Pan

> Remove unused CodeHaus Jackson dependencies
> ---
>
> Key: SPARK-48231
> URL: https://issues.apache.org/jira/browse/SPARK-48231
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Assigned] (SPARK-48230) Remove unused jodd-core

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48230:
-

Assignee: Cheng Pan

> Remove unused jodd-core
> ---
>
> Key: SPARK-48230
> URL: https://issues.apache.org/jira/browse/SPARK-48230
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Updated] (SPARK-48231) Remove unused CodeHaus Jackson dependencies

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48231?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48231:
--
Parent: SPARK-47046
Issue Type: Sub-task  (was: Improvement)

> Remove unused CodeHaus Jackson dependencies
> ---
>
> Key: SPARK-48231
> URL: https://issues.apache.org/jira/browse/SPARK-48231
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-47847) Deprecate spark.network.remoteReadNioBufferConversion

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47847?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-47847.
---
Fix Version/s: 3.5.2
   4.0.0
   Resolution: Fixed

Issue resolved by pull request 46047
[https://github.com/apache/spark/pull/46047]

> Deprecate spark.network.remoteReadNioBufferConversion
> -
>
> Key: SPARK-47847
> URL: https://issues.apache.org/jira/browse/SPARK-47847
> Project: Spark
>  Issue Type: Improvement
>  Components: Shuffle, Spark Core
>Affects Versions: 3.5.2
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.5.2, 4.0.0
>
>







[jira] [Assigned] (SPARK-47847) Deprecate spark.network.remoteReadNioBufferConversion

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47847?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-47847:
-

Assignee: Cheng Pan

> Deprecate spark.network.remoteReadNioBufferConversion
> -
>
> Key: SPARK-47847
> URL: https://issues.apache.org/jira/browse/SPARK-47847
> Project: Spark
>  Issue Type: Improvement
>  Components: Shuffle, Spark Core
>Affects Versions: 3.5.2
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Updated] (SPARK-47847) Deprecate spark.network.remoteReadNioBufferConversion

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47847?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-47847:
--
Parent: SPARK-44111
Issue Type: Sub-task  (was: Improvement)

> Deprecate spark.network.remoteReadNioBufferConversion
> -
>
> Key: SPARK-47847
> URL: https://issues.apache.org/jira/browse/SPARK-47847
> Project: Spark
>  Issue Type: Sub-task
>  Components: Shuffle, Spark Core
>Affects Versions: 3.5.2
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2
>
>







[jira] [Updated] (SPARK-48230) Remove unused jodd-core

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48230:
--
Parent: SPARK-47046
Issue Type: Sub-task  (was: Improvement)

> Remove unused jodd-core
> ---
>
> Key: SPARK-48230
> URL: https://issues.apache.org/jira/browse/SPARK-48230
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Updated] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48094:
--
Description: 
h2. ASF INFRA POLICY
 - [https://infra.apache.org/github-actions-policy.html]

h2. MONITORING
 - [https://infra-reports.apache.org/#ghactions=spark=168]

!Screenshot 2024-05-02 at 23.56.05.png|width=100%!

h2. TARGET
 * All workflows MUST have a job concurrency level less than or equal to 20. 
This means a workflow cannot have more than 20 jobs running at the same time 
across all matrices.
 * All workflows SHOULD have a job concurrency level less than or equal to 15. 
Just because 20 is the max, doesn't mean you should strive for 20.
 * The average number of minutes a project uses per calendar week MUST NOT 
exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 hours).
 * The average number of minutes a project uses in any consecutive five-day 
period MUST NOT exceed the equivalent of 30 full-time runners (216,000 minutes, 
or 3,600 hours).

h2. DEADLINE
{quote}17th of May, 2024
{quote}

  was:
h2. ASF INFRA POLICY
 - [https://infra.apache.org/github-actions-policy.html]

h2. MONITORING
 - [https://infra-reports.apache.org/#ghactions=spark=168]

!Screenshot 2024-05-02 at 23.56.05.png|width=100!
h2. TARGET
 * All workflows MUST have a job concurrency level less than or equal to 20. 
This means a workflow cannot have more than 20 jobs running at the same time 
across all matrices.
 * All workflows SHOULD have a job concurrency level less than or equal to 15. 
Just because 20 is the max, doesn't mean you should strive for 20.
 * The average number of minutes a project uses per calendar week MUST NOT 
exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 hours).
 * The average number of minutes a project uses in any consecutive five-day 
period MUST NOT exceed the equivalent of 30 full-time runners (216,000 minutes, 
or 3,600 hours).

h2. DEADLINE
{quote}17th of May, 2024
{quote}


> Reduce GitHub Action usage according to ASF project allowance
> -
>
> Key: SPARK-48094
> URL: https://issues.apache.org/jira/browse/SPARK-48094
> Project: Spark
>  Issue Type: Umbrella
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Priority: Major
> Attachments: Screenshot 2024-05-02 at 23.56.05.png
>
>
> h2. ASF INFRA POLICY
>  - [https://infra.apache.org/github-actions-policy.html]
> h2. MONITORING
>  - [https://infra-reports.apache.org/#ghactions=spark=168]
> !Screenshot 2024-05-02 at 23.56.05.png|width=100%!
> h2. TARGET
>  * All workflows MUST have a job concurrency level less than or equal to 20. 
> This means a workflow cannot have more than 20 jobs running at the same time 
> across all matrices.
>  * All workflows SHOULD have a job concurrency level less than or equal to 
> 15. Just because 20 is the max, doesn't mean you should strive for 20.
>  * The average number of minutes a project uses per calendar week MUST NOT 
> exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 
> hours).
>  * The average number of minutes a project uses in any consecutive five-day 
> period MUST NOT exceed the equivalent of 30 full-time runners (216,000 
> minutes, or 3,600 hours).
> h2. DEADLINE
> {quote}17th of May, 2024
> {quote}
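The allowance figures above can be sanity-checked with simple runner-minute arithmetic, assuming "full-time runner" means 24 hours a day (an assumption; the policy does not spell out the conversion).

```python
# Runner-minute arithmetic for the ASF allowances quoted above.
MINUTES_PER_RUNNER_DAY = 24 * 60   # one full-time runner, 24h/day

# Five-day cap: 30 full-time runners.
five_day_minutes = 30 * MINUTES_PER_RUNNER_DAY * 5
assert five_day_minutes == 216_000        # matches "216,000 minutes"
assert five_day_minutes // 60 == 3_600    # matches "3,600 hours"

# Weekly cap: 25 full-time runners for 7 days is 252,000 minutes;
# the policy quotes the rounded figures 250,000 minutes / 4,200 hours.
weekly_minutes = 25 * MINUTES_PER_RUNNER_DAY * 7
assert weekly_minutes == 252_000
```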






[jira] [Resolved] (SPARK-48201) Docstrings of the pyspark DataStream Reader methods are inaccurate

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48201?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48201.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46416
[https://github.com/apache/spark/pull/46416]

> Docstrings of the pyspark DataStream Reader methods are inaccurate
> --
>
> Key: SPARK-48201
> URL: https://issues.apache.org/jira/browse/SPARK-48201
> Project: Spark
>  Issue Type: Documentation
>  Components: PySpark
>Affects Versions: 3.4.3
>Reporter: Chloe He
>Assignee: Chloe He
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> The docstrings of the pyspark DataStream Reader methods {{csv()}} and 
> {{text()}} say that the {{path}} parameter can be a list, but actually when a 
> list is passed an error is raised.
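The mismatch described above can be illustrated with a pure-Python stand-in (this is NOT the real pyspark API, just a minimal sketch of a docstring that advertises list support while the implementation accepts only a single string):

```python
# Hypothetical stand-in for the pyspark DataStreamReader.csv docstring bug:
# the docstring claims `path` may be a list, but only a str is accepted.
def csv(path):
    """Load a streaming CSV source.

    Parameters
    ----------
    path : str or list
        (docstring claim -- in practice only a single ``str`` is accepted)
    """
    if not isinstance(path, str):
        raise TypeError("path can be only a single string")
    return f"stream<{path}>"

print(csv("/tmp/input"))        # a single path string works
try:
    csv(["/tmp/a", "/tmp/b"])   # the documented list form raises instead
except TypeError as exc:
    print(exc)
```

The fix tracked here updates the docstrings to match the actual behavior rather than changing the behavior itself.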






[jira] [Assigned] (SPARK-48201) Docstrings of the pyspark DataStream Reader methods are inaccurate

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48201?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48201:
-

Assignee: Chloe He

> Docstrings of the pyspark DataStream Reader methods are inaccurate
> --
>
> Key: SPARK-48201
> URL: https://issues.apache.org/jira/browse/SPARK-48201
> Project: Spark
>  Issue Type: Documentation
>  Components: PySpark
>Affects Versions: 3.4.3
>Reporter: Chloe He
>Assignee: Chloe He
>Priority: Minor
>  Labels: pull-request-available
>
> The docstrings of the pyspark DataStream Reader methods {{csv()}} and 
> {{text()}} say that the {{path}} parameter can be a list, but actually when a 
> list is passed an error is raised.






[jira] [Resolved] (SPARK-48228) Implement the missing function validation in ApplyInXXX

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48228?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48228.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46519
[https://github.com/apache/spark/pull/46519]

> Implement the missing function validation in ApplyInXXX
> ---
>
> Key: SPARK-48228
> URL: https://issues.apache.org/jira/browse/SPARK-48228
> Project: Spark
>  Issue Type: Sub-task
>  Components: Connect, PySpark
>Affects Versions: 4.0.0
>Reporter: Ruifeng Zheng
>Assignee: Ruifeng Zheng
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48228) Implement the missing function validation in ApplyInXXX

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48228?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48228:
-

Assignee: Ruifeng Zheng

> Implement the missing function validation in ApplyInXXX
> ---
>
> Key: SPARK-48228
> URL: https://issues.apache.org/jira/browse/SPARK-48228
> Project: Spark
>  Issue Type: Sub-task
>  Components: Connect, PySpark
>Affects Versions: 4.0.0
>Reporter: Ruifeng Zheng
>Assignee: Ruifeng Zheng
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-48224) Disable variant from being a part of a map key

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48224?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48224.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46516
[https://github.com/apache/spark/pull/46516]

> Disable variant from being a part of a map key
> --
>
> Key: SPARK-48224
> URL: https://issues.apache.org/jira/browse/SPARK-48224
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Harsh Motwani
>Assignee: Harsh Motwani
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Creating a map object with a variant key currently works. However, this 
> behavior should be disabled.






[jira] [Assigned] (SPARK-48224) Disable variant from being a part of a map key

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48224?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48224:
-

Assignee: Harsh Motwani

> Disable variant from being a part of a map key
> --
>
> Key: SPARK-48224
> URL: https://issues.apache.org/jira/browse/SPARK-48224
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Harsh Motwani
>Assignee: Harsh Motwani
>Priority: Major
>  Labels: pull-request-available
>
> Creating a map object with a variant key currently works. However, this 
> behavior should be disabled.






[jira] [Updated] (SPARK-48163) Disable `SparkConnectServiceSuite.SPARK-43923: commands send events - get_resources_command`

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48163?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48163:
--
Parent: (was: SPARK-44111)
Issue Type: Bug  (was: Sub-task)

> Disable `SparkConnectServiceSuite.SPARK-43923: commands send events - 
> get_resources_command`
> 
>
> Key: SPARK-48163
> URL: https://issues.apache.org/jira/browse/SPARK-48163
> Project: Spark
>  Issue Type: Bug
>  Components: SQL, Tests
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
>
> {code}
> - SPARK-43923: commands send events ((get_resources_command {
> [info] }
> [info] ,None)) *** FAILED *** (35 milliseconds)
> [info]   VerifyEvents.this.listener.executeHolder.isDefined was false 
> (SparkConnectServiceSuite.scala:873)
> {code}






[jira] [Closed] (SPARK-37626) Upgrade libthrift to 0.15.0

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-37626?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-37626.
-

> Upgrade libthrift to 0.15.0
> ---
>
> Key: SPARK-37626
> URL: https://issues.apache.org/jira/browse/SPARK-37626
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 3.3.0
>Reporter: Bo Zhang
>Priority: Major
>
> Upgrade libthrift to 0.15.0 in order to avoid 
> https://nvd.nist.gov/vuln/detail/CVE-2020-13949.






[jira] [Resolved] (SPARK-47018) Upgrade built-in Hive to 2.3.10

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47018?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-47018.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46468
[https://github.com/apache/spark/pull/46468]

> Upgrade built-in Hive to 2.3.10
> ---
>
> Key: SPARK-47018
> URL: https://issues.apache.org/jira/browse/SPARK-47018
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build, SQL
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Updated] (SPARK-47834) Mark deprecated functions with `@deprecated` in `SQLImplicits`

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47834?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-47834:
--
Parent: SPARK-44111
Issue Type: Sub-task  (was: Improvement)

> Mark deprecated functions with `@deprecated` in `SQLImplicits`
> --
>
> Key: SPARK-47834
> URL: https://issues.apache.org/jira/browse/SPARK-47834
> Project: Spark
>  Issue Type: Sub-task
>  Components: Connect, SQL
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-47834) Mark deprecated functions with `@deprecated` in `SQLImplicits`

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47834?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-47834.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46029
[https://github.com/apache/spark/pull/46029]

> Mark deprecated functions with `@deprecated` in `SQLImplicits`
> --
>
> Key: SPARK-47834
> URL: https://issues.apache.org/jira/browse/SPARK-47834
> Project: Spark
>  Issue Type: Improvement
>  Components: Connect, SQL
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-48227) Document the requirement of seed in protos

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48227?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48227.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46518
[https://github.com/apache/spark/pull/46518]

> Document the requirement of seed in protos
> --
>
> Key: SPARK-48227
> URL: https://issues.apache.org/jira/browse/SPARK-48227
> Project: Spark
>  Issue Type: Improvement
>  Components: Documentation, PySpark
>Affects Versions: 4.0.0
>Reporter: Ruifeng Zheng
>Assignee: Ruifeng Zheng
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48227) Document the requirement of seed in protos

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48227?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48227:
-

Assignee: Ruifeng Zheng

> Document the requirement of seed in protos
> --
>
> Key: SPARK-48227
> URL: https://issues.apache.org/jira/browse/SPARK-48227
> Project: Spark
>  Issue Type: Improvement
>  Components: Documentation, PySpark
>Affects Versions: 4.0.0
>Reporter: Ruifeng Zheng
>Assignee: Ruifeng Zheng
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-48226) Add `spark-ganglia-lgpl` to `lint-java` & `spark-ganglia-lgpl` and `jvm-profiler` to `sbt-checkstyle`

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48226?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48226.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46501
[https://github.com/apache/spark/pull/46501]

> Add `spark-ganglia-lgpl` to `lint-java` & `spark-ganglia-lgpl` and 
> `jvm-profiler` to `sbt-checkstyle`
> -
>
> Key: SPARK-48226
> URL: https://issues.apache.org/jira/browse/SPARK-48226
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48226) Add `spark-ganglia-lgpl` to `lint-java` & `spark-ganglia-lgpl` and `jvm-profiler` to `sbt-checkstyle`

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48226?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48226:
-

Assignee: BingKun Pan

> Add `spark-ganglia-lgpl` to `lint-java` & `spark-ganglia-lgpl` and 
> `jvm-profiler` to `sbt-checkstyle`
> -
>
> Key: SPARK-48226
> URL: https://issues.apache.org/jira/browse/SPARK-48226
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-48225) Upgrade `sbt` to 1.10.0

2024-05-09 Thread Dongjoon Hyun (Jira)
Dongjoon Hyun created SPARK-48225:
-

 Summary: Upgrade `sbt` to 1.10.0
 Key: SPARK-48225
 URL: https://issues.apache.org/jira/browse/SPARK-48225
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Dongjoon Hyun









[jira] [Closed] (SPARK-48164) Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - get_resources_command`

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48164?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-48164.
-

> Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - 
> get_resources_command`
> --
>
> Key: SPARK-48164
> URL: https://issues.apache.org/jira/browse/SPARK-48164
> Project: Spark
>  Issue Type: Bug
>  Components: Connect, Tests
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Priority: Major
>







[jira] [Updated] (SPARK-48164) Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - get_resources_command`

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48164?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48164:
--
Parent: (was: SPARK-44111)
Issue Type: Bug  (was: Sub-task)

> Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - 
> get_resources_command`
> --
>
> Key: SPARK-48164
> URL: https://issues.apache.org/jira/browse/SPARK-48164
> Project: Spark
>  Issue Type: Bug
>  Components: Connect, Tests
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Priority: Major
>







[jira] [Updated] (SPARK-48164) Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - get_resources_command`

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48164?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48164:
--
Priority: Major  (was: Blocker)

> Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - 
> get_resources_command`
> --
>
> Key: SPARK-48164
> URL: https://issues.apache.org/jira/browse/SPARK-48164
> Project: Spark
>  Issue Type: Sub-task
>  Components: Connect, Tests
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Priority: Major
>







[jira] [Updated] (SPARK-47930) Upgrade RoaringBitmap to 1.0.6

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47930?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-47930:
--
Parent: SPARK-47046
Issue Type: Sub-task  (was: Improvement)

> Upgrade RoaringBitmap to 1.0.6
> --
>
> Key: SPARK-47930
> URL: https://issues.apache.org/jira/browse/SPARK-47930
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Updated] (SPARK-47982) Update code style plugins to latest version

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47982?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-47982:
--
Parent: SPARK-47046
Issue Type: Sub-task  (was: Improvement)

> Update code style plugins to latest version
> 
>
> Key: SPARK-47982
> URL: https://issues.apache.org/jira/browse/SPARK-47982
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-48216) Remove overrides DockerJDBCIntegrationSuite.connectionTimeout to make related tests configurable

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48216?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48216.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46505
[https://github.com/apache/spark/pull/46505]

> Remove overrides DockerJDBCIntegrationSuite.connectionTimeout to make related 
> tests configurable
> 
>
> Key: SPARK-48216
> URL: https://issues.apache.org/jira/browse/SPARK-48216
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Docker, Tests
>Affects Versions: 4.0.0
>Reporter: Kent Yao
>Assignee: Kent Yao
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48216) Remove overrides DockerJDBCIntegrationSuite.connectionTimeout to make related tests configurable

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48216?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48216:
-

Assignee: Kent Yao

> Remove overrides DockerJDBCIntegrationSuite.connectionTimeout to make related 
> tests configurable
> 
>
> Key: SPARK-48216
> URL: https://issues.apache.org/jira/browse/SPARK-48216
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Docker, Tests
>Affects Versions: 4.0.0
>Reporter: Kent Yao
>Assignee: Kent Yao
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Updated] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48094:
--
Description: 
h2. ASF INFRA POLICY
 - [https://infra.apache.org/github-actions-policy.html]

h2. MONITORING
 - [https://infra-reports.apache.org/#ghactions=spark=168]

!Screenshot 2024-05-02 at 23.56.05.png|width=100!
h2. TARGET
 * All workflows MUST have a job concurrency level less than or equal to 20. 
This means a workflow cannot have more than 20 jobs running at the same time 
across all matrices.
 * All workflows SHOULD have a job concurrency level less than or equal to 15. 
Just because 20 is the max, doesn't mean you should strive for 20.
 * The average number of minutes a project uses per calendar week MUST NOT 
exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 hours).
 * The average number of minutes a project uses in any consecutive five-day 
period MUST NOT exceed the equivalent of 30 full-time runners (216,000 minutes, 
or 3,600 hours).

h2. DEADLINE
{quote}17th of May, 2024
{quote}

  was:
h2. ASF INFRA POLICY
- https://infra.apache.org/github-actions-policy.html

h2. MONITORING
- https://infra-reports.apache.org/#ghactions=spark=168

 !Screenshot 2024-05-02 at 23.56.05.png|width=100%! 

h2. TARGET
* All workflows MUST have a job concurrency level less than or equal to 20. 
This means a workflow cannot have more than 20 jobs running at the same time 
across all matrices.
* All workflows SHOULD have a job concurrency level less than or equal to 15. 
Just because 20 is the max, doesn't mean you should strive for 20.
* The average number of minutes a project uses per calendar week MUST NOT 
exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 hours).
* The average number of minutes a project uses in any consecutive five-day 
period MUST NOT exceed the equivalent of 30 full-time runners (216,000 minutes, 
or 3,600 hours).

h2. DEADLINE
bq. 17th of May, 2024

Since the deadline is 17th of May, 2024, I set this as the highest priority, 
`Blocker`.




> Reduce GitHub Action usage according to ASF project allowance
> -
>
> Key: SPARK-48094
> URL: https://issues.apache.org/jira/browse/SPARK-48094
> Project: Spark
>  Issue Type: Umbrella
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Priority: Major
> Attachments: Screenshot 2024-05-02 at 23.56.05.png
>
>
> h2. ASF INFRA POLICY
>  - [https://infra.apache.org/github-actions-policy.html]
> h2. MONITORING
>  - [https://infra-reports.apache.org/#ghactions=spark=168]
> !Screenshot 2024-05-02 at 23.56.05.png|width=100!
> h2. TARGET
>  * All workflows MUST have a job concurrency level less than or equal to 20. 
> This means a workflow cannot have more than 20 jobs running at the same time 
> across all matrices.
>  * All workflows SHOULD have a job concurrency level less than or equal to 
> 15. Just because 20 is the max, doesn't mean you should strive for 20.
>  * The average number of minutes a project uses per calendar week MUST NOT 
> exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 
> hours).
>  * The average number of minutes a project uses in any consecutive five-day 
> period MUST NOT exceed the equivalent of 30 full-time runners (216,000 
> minutes, or 3,600 hours).
> h2. DEADLINE
> {quote}17th of May, 2024
> {quote}






[jira] [Updated] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48094:
--
Priority: Major  (was: Blocker)

> Reduce GitHub Action usage according to ASF project allowance
> -
>
> Key: SPARK-48094
> URL: https://issues.apache.org/jira/browse/SPARK-48094
> Project: Spark
>  Issue Type: Umbrella
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Priority: Major
> Attachments: Screenshot 2024-05-02 at 23.56.05.png
>
>
> h2. ASF INFRA POLICY
> - https://infra.apache.org/github-actions-policy.html
> h2. MONITORING
> - https://infra-reports.apache.org/#ghactions=spark=168
>  !Screenshot 2024-05-02 at 23.56.05.png|width=100%! 
> h2. TARGET
> * All workflows MUST have a job concurrency level less than or equal to 20. 
> This means a workflow cannot have more than 20 jobs running at the same time 
> across all matrices.
> * All workflows SHOULD have a job concurrency level less than or equal to 15. 
> Just because 20 is the max, doesn't mean you should strive for 20.
> * The average number of minutes a project uses per calendar week MUST NOT 
> exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 
> hours).
> * The average number of minutes a project uses in any consecutive five-day 
> period MUST NOT exceed the equivalent of 30 full-time runners (216,000 
> minutes, or 3,600 hours).
> h2. DEADLINE
> bq. 17th of May, 2024
> Since the deadline is 17th of May, 2024, I set this as the highest priority, 
> `Blocker`.






[jira] [Updated] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48094:
--
Fix Version/s: (was: 4.0.0)

> Reduce GitHub Action usage according to ASF project allowance
> -
>
> Key: SPARK-48094
> URL: https://issues.apache.org/jira/browse/SPARK-48094
> Project: Spark
>  Issue Type: Umbrella
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Priority: Blocker
> Attachments: Screenshot 2024-05-02 at 23.56.05.png
>
>
> h2. ASF INFRA POLICY
> - https://infra.apache.org/github-actions-policy.html
> h2. MONITORING
> - https://infra-reports.apache.org/#ghactions=spark=168
>  !Screenshot 2024-05-02 at 23.56.05.png|width=100%! 
> h2. TARGET
> * All workflows MUST have a job concurrency level less than or equal to 20. 
> This means a workflow cannot have more than 20 jobs running at the same time 
> across all matrices.
> * All workflows SHOULD have a job concurrency level less than or equal to 15. 
> Just because 20 is the max, doesn't mean you should strive for 20.
> * The average number of minutes a project uses per calendar week MUST NOT 
> exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 
> hours).
> * The average number of minutes a project uses in any consecutive five-day 
> period MUST NOT exceed the equivalent of 30 full-time runners (216,000 
> minutes, or 3,600 hours).
> h2. DEADLINE
> bq. 17th of May, 2024
> Since the deadline is 17th of May, 2024, I set this as the highest priority, 
> `Blocker`.






[jira] [Updated] (SPARK-48187) Run `docs` only in PR builders and `build_non_ansi` Daily CI

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48187?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48187:
--
Fix Version/s: 4.0.0

> Run `docs` only in PR builders and `build_non_ansi` Daily CI
> 
>
> Key: SPARK-48187
> URL: https://issues.apache.org/jira/browse/SPARK-48187
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Updated] (SPARK-48187) Run `docs` only in PR builders and `build_non_ansi` Daily CI

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48187?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48187:
--
Fix Version/s: (was: 4.0.0)

> Run `docs` only in PR builders and `build_non_ansi` Daily CI
> 
>
> Key: SPARK-48187
> URL: https://issues.apache.org/jira/browse/SPARK-48187
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Reopened] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reopened SPARK-48094:
---
  Assignee: (was: Dongjoon Hyun)

> Reduce GitHub Action usage according to ASF project allowance
> -
>
> Key: SPARK-48094
> URL: https://issues.apache.org/jira/browse/SPARK-48094
> Project: Spark
>  Issue Type: Umbrella
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Priority: Blocker
> Fix For: 4.0.0
>
> Attachments: Screenshot 2024-05-02 at 23.56.05.png
>
>
> h2. ASF INFRA POLICY
> - https://infra.apache.org/github-actions-policy.html
> h2. MONITORING
> - https://infra-reports.apache.org/#ghactions=spark=168
>  !Screenshot 2024-05-02 at 23.56.05.png|width=100%! 
> h2. TARGET
> * All workflows MUST have a job concurrency level less than or equal to 20. 
> This means a workflow cannot have more than 20 jobs running at the same time 
> across all matrices.
> * All workflows SHOULD have a job concurrency level less than or equal to 15. 
> Just because 20 is the max, doesn't mean you should strive for 20.
> * The average number of minutes a project uses per calendar week MUST NOT 
> exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 
> hours).
> * The average number of minutes a project uses in any consecutive five-day 
> period MUST NOT exceed the equivalent of 30 full-time runners (216,000 
> minutes, or 3,600 hours).
> h2. DEADLINE
> bq. 17th of May, 2024
> Since the deadline is 17th of May, 2024, I set this as the highest priority, 
> `Blocker`.






[jira] [Assigned] (SPARK-48204) fix release script for Spark 4.0+

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48204?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48204:
-

Assignee: Wenchen Fan

> fix release script for Spark 4.0+
> -
>
> Key: SPARK-48204
> URL: https://issues.apache.org/jira/browse/SPARK-48204
> Project: Spark
>  Issue Type: Bug
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Wenchen Fan
>Assignee: Wenchen Fan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-48204) fix release script for Spark 4.0+

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48204?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48204.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46484
[https://github.com/apache/spark/pull/46484]

> fix release script for Spark 4.0+
> -
>
> Key: SPARK-48204
> URL: https://issues.apache.org/jira/browse/SPARK-48204
> Project: Spark
>  Issue Type: Bug
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Wenchen Fan
>Assignee: Wenchen Fan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48094.
---
  Assignee: Dongjoon Hyun
Resolution: Fixed

> Reduce GitHub Action usage according to ASF project allowance
> -
>
> Key: SPARK-48094
> URL: https://issues.apache.org/jira/browse/SPARK-48094
> Project: Spark
>  Issue Type: Umbrella
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Blocker
> Fix For: 4.0.0
>
> Attachments: Screenshot 2024-05-02 at 23.56.05.png
>
>
> h2. ASF INFRA POLICY
> - https://infra.apache.org/github-actions-policy.html
> h2. MONITORING
> - https://infra-reports.apache.org/#ghactions=spark=168
>  !Screenshot 2024-05-02 at 23.56.05.png|width=100%! 
> h2. TARGET
> * All workflows MUST have a job concurrency level less than or equal to 20. 
> This means a workflow cannot have more than 20 jobs running at the same time 
> across all matrices.
> * All workflows SHOULD have a job concurrency level less than or equal to 15. 
> Just because 20 is the max, doesn't mean you should strive for 20.
> * The average number of minutes a project uses per calendar week MUST NOT 
> exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 
> hours).
> * The average number of minutes a project uses in any consecutive five-day 
> period MUST NOT exceed the equivalent of 30 full-time runners (216,000 
> minutes, or 3,600 hours).
> h2. DEADLINE
> bq. 17th of May, 2024
> Since the deadline is 17th of May, 2024, I set this as the highest priority, 
> `Blocker`.






[jira] [Resolved] (SPARK-48207) Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if needed

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48207?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48207.
---
Fix Version/s: 3.4.4
   Resolution: Fixed

Issue resolved by pull request 46489
[https://github.com/apache/spark/pull/46489]

> Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if needed
> 
>
> Key: SPARK-48207
> URL: https://issues.apache.org/jira/browse/SPARK-48207
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 3.4.4
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.4.4
>
>







[jira] [Assigned] (SPARK-48207) Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if needed

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48207?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48207:
-

Assignee: Dongjoon Hyun

> Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if needed
> 
>
> Key: SPARK-48207
> URL: https://issues.apache.org/jira/browse/SPARK-48207
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 3.4.4
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Updated] (SPARK-48207) Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if needed

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48207?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48207:
--
Summary: Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if 
needed  (was: Run build/scala-213/java-11-17 jobs of `branch-3.4` only if 
needed)

> Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if needed
> 
>
> Key: SPARK-48207
> URL: https://issues.apache.org/jira/browse/SPARK-48207
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 3.4.4
>Reporter: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-48207) Run build/scala-213/java-11-17 jobs of `branch-3.4` only if needed

2024-05-08 Thread Dongjoon Hyun (Jira)
Dongjoon Hyun created SPARK-48207:
-

 Summary: Run build/scala-213/java-11-17 jobs of `branch-3.4` only 
if needed
 Key: SPARK-48207
 URL: https://issues.apache.org/jira/browse/SPARK-48207
 Project: Spark
  Issue Type: Sub-task
  Components: Project Infra
Affects Versions: 3.4.4
Reporter: Dongjoon Hyun









[jira] [Updated] (SPARK-48192) Enable TPC-DS and docker tests in forked repository

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48192?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48192:
--
Fix Version/s: 3.4.4

> Enable TPC-DS and docker tests in forked repository
> ---
>
> Key: SPARK-48192
> URL: https://issues.apache.org/jira/browse/SPARK-48192
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra, SQL
>Affects Versions: 4.0.0
>Reporter: Hyukjin Kwon
>Assignee: Hyukjin Kwon
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2, 3.4.4
>
>
> TPC-DS is pretty important in SQL. We should at least enable it in forked 
> repositories (PR builders), which do not consume ASF resources.






[jira] [Updated] (SPARK-48132) Run `k8s-integration-tests` only in PR builder and Daily CIs

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48132?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48132:
--
Fix Version/s: 3.4.4

> Run `k8s-integration-tests` only in PR builder and Daily CIs
> 
>
> Key: SPARK-48132
> URL: https://issues.apache.org/jira/browse/SPARK-48132
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2, 3.4.4
>
>







[jira] [Updated] (SPARK-48192) Enable TPC-DS and docker tests in forked repository

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48192?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48192:
--
Summary: Enable TPC-DS and docker tests in forked repository  (was: Enable 
TPC-DS tests in forked repository)

> Enable TPC-DS and docker tests in forked repository
> ---
>
> Key: SPARK-48192
> URL: https://issues.apache.org/jira/browse/SPARK-48192
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra, SQL
>Affects Versions: 4.0.0
>Reporter: Hyukjin Kwon
>Assignee: Hyukjin Kwon
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2
>
>
> TPC-DS is pretty important in SQL. We should at least enable it in forked 
> repositories (PR builders), which do not consume ASF resources.






[jira] [Updated] (SPARK-48192) Enable TPC-DS tests in forked repository

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48192?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48192:
--
Fix Version/s: 3.5.2

> Enable TPC-DS tests in forked repository
> 
>
> Key: SPARK-48192
> URL: https://issues.apache.org/jira/browse/SPARK-48192
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra, SQL
>Affects Versions: 4.0.0
>Reporter: Hyukjin Kwon
>Assignee: Hyukjin Kwon
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2
>
>
> TPC-DS is pretty important in SQL. We should at least enable it in forked 
> repositories (PR builders), which do not consume ASF resources.






[jira] [Updated] (SPARK-48133) Run `sparkr` only in PR builders and Daily CIs

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48133?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48133:
--
Fix Version/s: 3.5.2

> Run `sparkr` only in PR builders and Daily CIs
> --
>
> Key: SPARK-48133
> URL: https://issues.apache.org/jira/browse/SPARK-48133
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2
>
>







[jira] [Updated] (SPARK-48109) Enable `k8s-integration-tests` only for `kubernetes` module change

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48109?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48109:
--
Fix Version/s: 3.5.2

> Enable `k8s-integration-tests` only for `kubernetes` module change
> --
>
> Key: SPARK-48109
> URL: https://issues.apache.org/jira/browse/SPARK-48109
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2
>
>
> Although there is a chance of missing the related core module change, daily 
> CI test coverage will reveal that.
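
The gating described here (run the Kubernetes integration tests only when the `kubernetes` module changes) amounts to a path filter over the changed files. A minimal sketch, with illustrative patterns rather than the actual workflow configuration:

```python
from fnmatch import fnmatch

# Illustrative patterns only; the real precheck lives in the Spark
# GitHub Actions workflow, not in this snippet.
K8S_PATTERNS = (
    "resource-managers/kubernetes/*",  # the kubernetes module itself
    "bin/docker-image-tool.sh",        # image tooling used by the tests
)

def should_run_k8s_tests(changed_files):
    """Return True if any changed file matches a kubernetes-related pattern."""
    return any(fnmatch(f, p) for f in changed_files for p in K8S_PATTERNS)

print(should_run_k8s_tests(["core/src/main/scala/A.scala"]))           # False
print(should_run_k8s_tests(["resource-managers/kubernetes/pom.xml"]))  # True
```

As the message notes, a filter like this can miss a relevant change in the core module; the daily CI, which runs unconditionally, acts as the safety net.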






[jira] [Updated] (SPARK-48116) Run `pyspark-pandas*` only in PR builder and Daily Python CIs

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48116?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48116:
--
Fix Version/s: 3.4.4

> Run `pyspark-pandas*` only in PR builder and Daily Python CIs
> -
>
> Key: SPARK-48116
> URL: https://issues.apache.org/jira/browse/SPARK-48116
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2, 3.4.4
>
>







[jira] [Updated] (SPARK-48116) Run `pyspark-pandas*` only in PR builder and Daily Python CIs

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48116?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48116:
--
Fix Version/s: 3.5.2

> Run `pyspark-pandas*` only in PR builder and Daily Python CIs
> -
>
> Key: SPARK-48116
> URL: https://issues.apache.org/jira/browse/SPARK-48116
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2
>
>







[jira] [Resolved] (SPARK-48203) Spin off `pyspark` tests from `build_branch34.yml` Daily CI

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48203?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48203.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46480
[https://github.com/apache/spark/pull/46480]

> Spin off `pyspark` tests from `build_branch34.yml` Daily CI
> ---
>
> Key: SPARK-48203
> URL: https://issues.apache.org/jira/browse/SPARK-48203
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-48202) Spin off `pyspark` tests from `build_branch35.yml` Daily CI

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48202?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48202.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46479
[https://github.com/apache/spark/pull/46479]

> Spin off `pyspark` tests from `build_branch35.yml` Daily CI
> ---
>
> Key: SPARK-48202
> URL: https://issues.apache.org/jira/browse/SPARK-48202
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (ORC-1715) Bump org.objenesis:objenesis to 3.3

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/ORC-1715?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved ORC-1715.

Fix Version/s: 2.0.1
   2.1.0
   Resolution: Fixed

Issue resolved by pull request 1927
[https://github.com/apache/orc/pull/1927]

> Bump org.objenesis:objenesis to 3.3
> ---
>
> Key: ORC-1715
> URL: https://issues.apache.org/jira/browse/ORC-1715
> Project: ORC
>  Issue Type: Bug
>  Components: Java
>Affects Versions: 2.0.1
>Reporter: William Hyun
>Assignee: William Hyun
>Priority: Minor
> Fix For: 2.0.1, 2.1.0
>
>






[jira] [Closed] (SPARK-48149) Serialize `build_python.yml` to run a single Python version per cron schedule

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48149?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-48149.
-

> Serialize `build_python.yml` to run a single Python version per cron schedule
> -
>
> Key: SPARK-48149
> URL: https://issues.apache.org/jira/browse/SPARK-48149
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-48149) Serialize `build_python.yml` to run a single Python version per cron schedule

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48149?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48149.
---
Fix Version/s: (was: 4.0.0)
   Resolution: Abandoned

> Serialize `build_python.yml` to run a single Python version per cron schedule
> -
>
> Key: SPARK-48149
> URL: https://issues.apache.org/jira/browse/SPARK-48149
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Reopened] (SPARK-48149) Serialize `build_python.yml` to run a single Python version per cron schedule

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48149?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reopened SPARK-48149:
---

> Serialize `build_python.yml` to run a single Python version per cron schedule
> -
>
> Key: SPARK-48149
> URL: https://issues.apache.org/jira/browse/SPARK-48149
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Commented] (SPARK-48149) Serialize `build_python.yml` to run a single Python version per cron schedule

2024-05-08 Thread Dongjoon Hyun (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-48149?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17844713#comment-17844713
 ] 

Dongjoon Hyun commented on SPARK-48149:
---

This was effectively reverted via SPARK-48200.

> Serialize `build_python.yml` to run a single Python version per cron schedule
> -
>
> Key: SPARK-48149
> URL: https://issues.apache.org/jira/browse/SPARK-48149
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-48200) Split `build_python.yml` into per-version cron jobs

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48200?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48200.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46477
[https://github.com/apache/spark/pull/46477]

> Split `build_python.yml` into per-version cron jobs
> ---
>
> Key: SPARK-48200
> URL: https://issues.apache.org/jira/browse/SPARK-48200
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48149) Serialize `build_python.yml` to run a single Python version per cron schedule

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48149?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48149:
-

Assignee: (was: Dongjoon Hyun)

> Serialize `build_python.yml` to run a single Python version per cron schedule
> -
>
> Key: SPARK-48149
> URL: https://issues.apache.org/jira/browse/SPARK-48149
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>






