[jira] [Resolved] (SPARK-48328) Upgrade `Arrow` to 16.1.0

2024-05-20 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48328?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48328.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46646
[https://github.com/apache/spark/pull/46646]

> Upgrade `Arrow` to 16.1.0
> -
>
> Key: SPARK-48328
> URL: https://issues.apache.org/jira/browse/SPARK-48328
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-48017) Add Spark application submission worker for operator

2024-05-20 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48017?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48017:
-

Assignee: Zhou JIANG

> Add Spark application submission worker for operator
> 
>
> Key: SPARK-48017
> URL: https://issues.apache.org/jira/browse/SPARK-48017
> Project: Spark
>  Issue Type: Sub-task
>  Components: k8s
>Affects Versions: kubernetes-operator-0.1.0
>Reporter: Zhou JIANG
>Assignee: Zhou JIANG
>Priority: Major
>  Labels: pull-request-available
>
> Spark Operator needs a submission worker that converts its application 
> abstraction (Operator API) into k8s resources. 






[jira] [Resolved] (SPARK-48017) Add Spark application submission worker for operator

2024-05-20 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48017?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48017.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 10
[https://github.com/apache/spark-kubernetes-operator/pull/10]

> Add Spark application submission worker for operator
> 
>
> Key: SPARK-48017
> URL: https://issues.apache.org/jira/browse/SPARK-48017
> Project: Spark
>  Issue Type: Sub-task
>  Components: k8s
>Affects Versions: kubernetes-operator-0.1.0
>Reporter: Zhou JIANG
>Assignee: Zhou JIANG
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Spark Operator needs a submission worker that converts its application 
> abstraction (Operator API) into k8s resources. 






[jira] [Resolved] (SPARK-48256) Add a rule to check file headers for the java side, and fix inconsistent files

2024-05-15 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48256?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48256.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46557
[https://github.com/apache/spark/pull/46557]

> Add a rule to check file headers for the java side, and fix inconsistent files
> --
>
> Key: SPARK-48256
> URL: https://issues.apache.org/jira/browse/SPARK-48256
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48256) Add a rule to check file headers for the java side, and fix inconsistent files

2024-05-15 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48256?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48256:
-

Assignee: BingKun Pan

> Add a rule to check file headers for the java side, and fix inconsistent files
> --
>
> Key: SPARK-48256
> URL: https://issues.apache.org/jira/browse/SPARK-48256
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
>







[jira] [Assigned] (SPARK-48218) TransportClientFactory.createClient may NPE cause FetchFailedException

2024-05-15 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48218?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48218:
-

Assignee: dzcxzl

> TransportClientFactory.createClient may NPE cause FetchFailedException
> --
>
> Key: SPARK-48218
> URL: https://issues.apache.org/jira/browse/SPARK-48218
> Project: Spark
>  Issue Type: Improvement
>  Components: Shuffle
>Affects Versions: 4.0.0
>Reporter: dzcxzl
>Assignee: dzcxzl
>Priority: Minor
>  Labels: pull-request-available
>
> {code:java}
> org.apache.spark.shuffle.FetchFailedException
>   at 
> org.apache.spark.storage.ShuffleBlockFetcherIterator.throwFetchFailedException(ShuffleBlockFetcherIterator.scala:1180)
>   at 
> org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:913)
>   at 
> org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:84)
>   at 
> org.apache.spark.util.CompletionIterator.next(CompletionIterator.scala:29)
> Caused by: java.lang.NullPointerException
>   at 
> org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:178)
>   at 
> org.apache.spark.network.shuffle.ExternalBlockStoreClient.lambda$fetchBlocks$0(ExternalBlockStoreClient.java:128)
>   at 
> org.apache.spark.network.shuffle.RetryingBlockTransferor.transferAllOutstanding(RetryingBlockTransferor.java:154)
>   at 
> org.apache.spark.network.shuffle.RetryingBlockTransferor.start(RetryingBlockTransferor.java:133)
>   at 
> org.apache.spark.network.shuffle.ExternalBlockStoreClient.fetchBlocks(ExternalBlockStoreClient.java:139)
> {code}
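The trace above shows {{TransportClientFactory.createClient}} dereferencing a pooled client slot that can be null, e.g. when a concurrent close has cleared it. A minimal, self-contained sketch of the defensive pattern is below; the {{ClientPool}} class and "client-N" strings are hypothetical stand-ins for illustration, not the actual Spark classes or the actual fix.

```java
import java.util.concurrent.atomic.AtomicReferenceArray;

// Hypothetical sketch: a pooled client slot may be nulled by a concurrent
// close(), so a createClient must re-check the slot before dereferencing it
// instead of assuming the cached entry is still there (the unguarded version
// would throw the NullPointerException seen in the stack trace).
class ClientPool {
    // Stand-in for a pool of TransportClient objects.
    private final AtomicReferenceArray<String> clients;

    ClientPool(int size) {
        clients = new AtomicReferenceArray<>(size);
    }

    // Guarded version: treat a null slot as "needs a fresh client".
    String createClient(int i) {
        String c = clients.get(i);
        if (c == null) {
            c = "client-" + i;   // create a replacement client
            clients.set(i, c);
        }
        return c.toUpperCase();  // safe: c is never null here
    }

    void close(int i) {
        clients.set(i, null);    // a concurrent close clears the slot
    }

    public static void main(String[] args) {
        ClientPool pool = new ClientPool(2);
        pool.close(0);                            // simulate the racing close()
        System.out.println(pool.createClient(0)); // prints CLIENT-0, no NPE
    }
}
```

The key point is that the slot is read once into a local variable and null-checked before use, so the window between lookup and dereference cannot trigger an NPE.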






[jira] [Resolved] (SPARK-48218) TransportClientFactory.createClient may NPE cause FetchFailedException

2024-05-15 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48218?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48218.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46506
[https://github.com/apache/spark/pull/46506]

> TransportClientFactory.createClient may NPE cause FetchFailedException
> --
>
> Key: SPARK-48218
> URL: https://issues.apache.org/jira/browse/SPARK-48218
> Project: Spark
>  Issue Type: Improvement
>  Components: Shuffle
>Affects Versions: 4.0.0
>Reporter: dzcxzl
>Assignee: dzcxzl
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> {code:java}
> org.apache.spark.shuffle.FetchFailedException
>   at 
> org.apache.spark.storage.ShuffleBlockFetcherIterator.throwFetchFailedException(ShuffleBlockFetcherIterator.scala:1180)
>   at 
> org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:913)
>   at 
> org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:84)
>   at 
> org.apache.spark.util.CompletionIterator.next(CompletionIterator.scala:29)
> Caused by: java.lang.NullPointerException
>   at 
> org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:178)
>   at 
> org.apache.spark.network.shuffle.ExternalBlockStoreClient.lambda$fetchBlocks$0(ExternalBlockStoreClient.java:128)
>   at 
> org.apache.spark.network.shuffle.RetryingBlockTransferor.transferAllOutstanding(RetryingBlockTransferor.java:154)
>   at 
> org.apache.spark.network.shuffle.RetryingBlockTransferor.start(RetryingBlockTransferor.java:133)
>   at 
> org.apache.spark.network.shuffle.ExternalBlockStoreClient.fetchBlocks(ExternalBlockStoreClient.java:139)
> {code}






[jira] [Resolved] (SPARK-48049) Upgrade Scala to 2.13.14

2024-05-15 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48049?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48049.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46288
[https://github.com/apache/spark/pull/46288]

> Upgrade Scala to 2.13.14
> 
>
> Key: SPARK-48049
> URL: https://issues.apache.org/jira/browse/SPARK-48049
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-48285) Update docs for size function and sizeOfNull configuration

2024-05-15 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48285?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48285.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46592
[https://github.com/apache/spark/pull/46592]

> Update docs for size function and sizeOfNull configuration
> --
>
> Key: SPARK-48285
> URL: https://issues.apache.org/jira/browse/SPARK-48285
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Kent Yao
>Assignee: Kent Yao
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







Re: [ANNOUNCE] Announcing Apache ORC 2.0.1

2024-05-15 Thread Dongjoon Hyun
Thank you so much, William.

Dongjoon.

On Tue, May 14, 2024 at 10:50 PM William H.  wrote:

> Hi All!
>
> We are happy to announce the availability of Apache ORC 2.0.1!
>
> https://orc.apache.org/news/2024/05/14/ORC-2.0.1/
>
> 2.0.1 is a maintenance release containing bug fixes and tool improvements.
> It's available in Apache Downloads and Maven Central.
>
> https://downloads.apache.org/orc/orc-2.0.1/
> https://repo1.maven.org/maven2/org/apache/orc/orc-core/2.0.1/
>
> Bests,
> William
>



[jira] [Commented] (SPARK-48238) Spark fail to start due to class o.a.h.yarn.server.webproxy.amfilter.AmIpFilter is not a jakarta.servlet.Filter

2024-05-15 Thread Dongjoon Hyun (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-48238?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17846701#comment-17846701
 ] 

Dongjoon Hyun commented on SPARK-48238:
---

Hi, [~chengpan], [~HF], and [~cloud_fan]. Is it true that we need to revert 
SPARK-45522 and SPARK-47118 solely for YARN support?
Do you think there is an alternative, as we did for Hadoop 2/Hadoop 3 support 
and Hive 1/Hive 2 support?
For example, can we isolate the Jetty issues to the YARN module and JettyUtil via 
configurations?
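The underlying incompatibility is that Hadoop's AmIpFilter targets the javax.servlet API while newer Jetty dispatches jakarta.servlet types. One conceivable isolation approach is an adapter that wraps the legacy filter behind the new interface. The sketch below uses minimal stand-in interfaces so it is self-contained; the real javax.servlet and jakarta.servlet Filter interfaces have more methods and different signatures, and this is not the actual Spark or Hadoop code.

```java
// Stand-in for javax.servlet.Filter (the API AmIpFilter implements).
interface LegacyFilter {
    String doFilter(String request);
}

// Stand-in for jakarta.servlet.Filter (the API newer Jetty expects).
interface ModernFilter {
    String doFilter(String request);
}

// Adapter: presents the jakarta-style interface while delegating to the
// legacy javax-style filter, so old filters can run in a new container.
class FilterAdapter implements ModernFilter {
    private final LegacyFilter delegate;

    FilterAdapter(LegacyFilter delegate) {
        this.delegate = delegate;
    }

    @Override
    public String doFilter(String request) {
        // Translate the jakarta-side call into the javax-side API.
        return delegate.doFilter(request);
    }

    public static void main(String[] args) {
        LegacyFilter amIpFilter = req -> "filtered:" + req; // stand-in filter
        ModernFilter adapted = new FilterAdapter(amIpFilter);
        System.out.println(adapted.doFilter("GET /"));      // prints filtered:GET /
    }
}
```

In practice such an adapter would also need to translate request/response wrapper objects in both directions, which is the hard part of bridging the two servlet namespaces.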

> Spark fail to start due to class 
> o.a.h.yarn.server.webproxy.amfilter.AmIpFilter is not a jakarta.servlet.Filter
> ---
>
> Key: SPARK-48238
> URL: https://issues.apache.org/jira/browse/SPARK-48238
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Priority: Blocker
>
> I tested the latest master branch; it failed to start in YARN mode
> {code:java}
> dev/make-distribution.sh --tgz -Phive,hive-thriftserver,yarn{code}
>  
> {code:java}
> $ bin/spark-sql --master yarn
> WARNING: Using incubator modules: jdk.incubator.vector
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
> setLogLevel(newLevel).
> 2024-05-10 17:58:17 WARN NativeCodeLoader: Unable to load native-hadoop 
> library for your platform... using builtin-java classes where applicable
> 2024-05-10 17:58:18 WARN Client: Neither spark.yarn.jars nor 
> spark.yarn.archive} is set, falling back to uploading libraries under 
> SPARK_HOME.
> 2024-05-10 17:58:25 ERROR SparkContext: Error initializing SparkContext.
> org.sparkproject.jetty.util.MultiException: Multiple exceptions
>     at 
> org.sparkproject.jetty.util.MultiException.ifExceptionThrow(MultiException.java:117)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.servlet.ServletHandler.initialize(ServletHandler.java:751)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:392)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:902)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:306)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:93)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at org.apache.spark.ui.ServerInfo.addHandler(JettyUtils.scala:514) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2(SparkUI.scala:81) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2$adapted(SparkUI.scala:81)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619) 
> ~[scala-library-2.13.13.jar:?]
>     at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617) 
> ~[scala-library-2.13.13.jar:?]
>     at scala.collection.AbstractIterable.foreach(Iterable.scala:935) 
> ~[scala-library-2.13.13.jar:?]
>     at 
> org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1(SparkUI.scala:81) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1$adapted(SparkUI.scala:79)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?]
>     at org.apache.spark.ui.SparkUI.attachAllHandlers(SparkUI.scala:79) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at org.apache.spark.SparkContext.$anonfun$new$31(SparkContext.scala:690) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.SparkContext.$anonfun$new$31$adapted(SparkContext.scala:690) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?]
>     at org.apache.spark.SparkContext.(SparkContext.scala:690) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2963) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.

[jira] [Updated] (SPARK-48238) Spark fail to start due to class o.a.h.yarn.server.webproxy.amfilter.AmIpFilter is not a jakarta.servlet.Filter

2024-05-15 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48238?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48238:
--
Description: 
I tested the latest master branch; it failed to start in YARN mode
{code:java}
dev/make-distribution.sh --tgz -Phive,hive-thriftserver,yarn{code}
 
{code:java}
$ bin/spark-sql --master yarn
WARNING: Using incubator modules: jdk.incubator.vector
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
setLogLevel(newLevel).
2024-05-10 17:58:17 WARN NativeCodeLoader: Unable to load native-hadoop library 
for your platform... using builtin-java classes where applicable
2024-05-10 17:58:18 WARN Client: Neither spark.yarn.jars nor 
spark.yarn.archive} is set, falling back to uploading libraries under 
SPARK_HOME.
2024-05-10 17:58:25 ERROR SparkContext: Error initializing SparkContext.
org.sparkproject.jetty.util.MultiException: Multiple exceptions
    at 
org.sparkproject.jetty.util.MultiException.ifExceptionThrow(MultiException.java:117)
 ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.sparkproject.jetty.servlet.ServletHandler.initialize(ServletHandler.java:751)
 ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.sparkproject.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:392)
 ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.sparkproject.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:902)
 ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.sparkproject.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:306)
 ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:93)
 ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at org.apache.spark.ui.ServerInfo.addHandler(JettyUtils.scala:514) 
~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2(SparkUI.scala:81) 
~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2$adapted(SparkUI.scala:81)
 ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619) 
~[scala-library-2.13.13.jar:?]
    at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617) 
~[scala-library-2.13.13.jar:?]
    at scala.collection.AbstractIterable.foreach(Iterable.scala:935) 
~[scala-library-2.13.13.jar:?]
    at 
org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1(SparkUI.scala:81) 
~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1$adapted(SparkUI.scala:79)
 ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?]
    at org.apache.spark.ui.SparkUI.attachAllHandlers(SparkUI.scala:79) 
~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at org.apache.spark.SparkContext.$anonfun$new$31(SparkContext.scala:690) 
~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.SparkContext.$anonfun$new$31$adapted(SparkContext.scala:690) 
~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?]
    at org.apache.spark.SparkContext.(SparkContext.scala:690) 
~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2963) 
~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:1118)
 ~[spark-sql_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at scala.Option.getOrElse(Option.scala:201) [scala-library-2.13.13.jar:?]
    at 
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:1112) 
[spark-sql_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:64) 
[spark-hive-thriftserver_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.(SparkSQLCLIDriver.scala:405)
 [spark-hive-thriftserver_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:162)
 [spark-hive-thriftserver_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at 
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
 [spark-hive-thriftserver_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native 
Method) ~[?:?]
    at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
 ~[?:?]
    at 

Re: [ANNOUNCE] New ORC committer: Yuanping Wu

2024-05-15 Thread Dongjoon Hyun
Congratulations and welcome, Yuanping.

Dongjoon

On Tue, May 14, 2024 at 9:05 PM William H.  wrote:

> Welcome Yuanping,
>
> Thank you for your contribution to the Apache ORC community and
> congratulations on this new role!
>
> Bests,
> William
>
>
> On Wed, May 15, 2024 at 10:22 AM Gang Wu  wrote:
> >
> > Hi,
> >
> > On behalf of the ORC PMC, I'm happy to announce that Yuanping Wu has
> > accepted an invitation to become a committer on Apache ORC. Welcome,
> > and thank you for your contributions!
> >
> > Cheers,
> > Gang
>


[jira] [Resolved] (SPARK-48279) Upgrade ORC to 2.0.1

2024-05-15 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48279?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48279.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46587
[https://github.com/apache/spark/pull/46587]

> Upgrade ORC to 2.0.1
> 
>
> Key: SPARK-48279
> URL: https://issues.apache.org/jira/browse/SPARK-48279
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: William Hyun
>Assignee: William Hyun
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







Re: [ANNOUNCE] New ORC committer: Shaoyun Chen

2024-05-14 Thread Dongjoon Hyun
Welcome, Shaoyun! :)

Dongjoon.

On Tue, May 14, 2024 at 02:05 Gang Wu  wrote:

> Hi,
>
> On behalf of the ORC PMC, I'm happy to announce that Shaoyun Chen has
> accepted an invitation to become a committer on Apache ORC. Welcome,
> and thank you for your contributions!
>
> Cheers,
> Gang
>


[jira] [Updated] (SPARK-48231) Remove unused CodeHaus Jackson dependencies

2024-05-13 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48231?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48231:
--
Parent: (was: SPARK-47046)
Issue Type: Bug  (was: Sub-task)

> Remove unused CodeHaus Jackson dependencies
> ---
>
> Key: SPARK-48231
> URL: https://issues.apache.org/jira/browse/SPARK-48231
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Updated] (SPARK-48230) Remove unused jodd-core

2024-05-13 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48230:
--
Parent: (was: SPARK-47046)
Issue Type: Bug  (was: Sub-task)

> Remove unused jodd-core
> ---
>
> Key: SPARK-48230
> URL: https://issues.apache.org/jira/browse/SPARK-48230
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







Re: [VOTE] SPIP: Stored Procedures API for Catalogs

2024-05-12 Thread Dongjoon Hyun
+1

On Sun, May 12, 2024 at 3:50 PM huaxin gao  wrote:

> +1
>
> On Sat, May 11, 2024 at 4:35 PM L. C. Hsieh  wrote:
>
>> +1
>>
>> On Sat, May 11, 2024 at 3:11 PM Chao Sun  wrote:
>> >
>> > +1
>> >
>> > On Sat, May 11, 2024 at 2:10 PM L. C. Hsieh  wrote:
>> >>
>> >> Hi all,
>> >>
>> >> I’d like to start a vote for SPIP: Stored Procedures API for Catalogs.
>> >>
>> >> Please also refer to:
>> >>
>> >>- Discussion thread:
>> >> https://lists.apache.org/thread/7r04pz544c9qs3gc8q2nyj3fpzfnv8oo
>> >>- JIRA ticket: https://issues.apache.org/jira/browse/SPARK-44167
>> >>- SPIP doc:
>> https://docs.google.com/document/d/1rDcggNl9YNcBECsfgPcoOecHXYZOu29QYFrloo2lPBg/
>> >>
>> >>
>> >> Please vote on the SPIP for the next 72 hours:
>> >>
>> >> [ ] +1: Accept the proposal as an official SPIP
>> >> [ ] +0
>> >> [ ] -1: I don’t think this is a good idea because …
>> >>
>> >>
>> >> Thank you!
>> >>
>> >> Liang-Chi Hsieh
>> >>
>> >> -
>> >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>> >>
>>
>> -
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>>


[jira] [Updated] (SPARK-48237) After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be deleted

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48237?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48237:
--
Issue Type: Bug  (was: Improvement)

> After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be 
> deleted 
> 
>
> Key: SPARK-48237
> URL: https://issues.apache.org/jira/browse/SPARK-48237
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2, 3.4.4
>
>







[jira] [Assigned] (SPARK-48237) After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be deleted

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48237?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48237:
-

Assignee: BingKun Pan

> After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be 
> deleted 
> 
>
> Key: SPARK-48237
> URL: https://issues.apache.org/jira/browse/SPARK-48237
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-48237) After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be deleted

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48237?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48237.
---
Fix Version/s: 3.4.4
   3.5.2
   4.0.0
   Resolution: Fixed

Issue resolved by pull request 46531
[https://github.com/apache/spark/pull/46531]

> After executing `test-dependencies.sh`, the dir `dev/pr-deps` should be 
> deleted 
> 
>
> Key: SPARK-48237
> URL: https://issues.apache.org/jira/browse/SPARK-48237
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 3.4.4, 3.5.2, 4.0.0
>
>







[jira] [Resolved] (SPARK-48236) Add `commons-lang:commons-lang:2.6` back to support legacy Hive UDF jars

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48236?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48236.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46528
[https://github.com/apache/spark/pull/46528]

> Add `commons-lang:commons-lang:2.6` back to support legacy Hive UDF jars
> 
>
> Key: SPARK-48236
> URL: https://issues.apache.org/jira/browse/SPARK-48236
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>    Assignee: Dongjoon Hyun
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Commented] (SPARK-48230) Remove unused jodd-core

2024-05-10 Thread Dongjoon Hyun (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17845522#comment-17845522
 ] 

Dongjoon Hyun commented on SPARK-48230:
---

We will revisit this dependency.

> Remove unused jodd-core
> ---
>
> Key: SPARK-48230
> URL: https://issues.apache.org/jira/browse/SPARK-48230
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Updated] (SPARK-48230) Remove unused jodd-core

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48230:
--
Fix Version/s: (was: 4.0.0)

> Remove unused jodd-core
> ---
>
> Key: SPARK-48230
> URL: https://issues.apache.org/jira/browse/SPARK-48230
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







Re: [VOTE] Release Apache ORC 2.0.1 (RC0)

2024-05-10 Thread Dongjoon Hyun
+1

Thank you for leading this, William.

I checked the checksum and sig and ran Java part with `mvnw package -Panalyze`.
It seems that there is no regression.

Although there are ongoing PRs on the benchmark module since 2.0.0, we can 
address them later. I'd like to take this chance to thank Shaoyun (cxzl25).

Dongjoon.

On 2024/05/10 18:17:17 "William H." wrote:
> I will start with my +1.
> 
> On Fri, May 10, 2024 at 11:04 AM William H.  wrote:
> >
> > Please vote on releasing the following candidate as Apache ORC version 
> > 2.0.1.
> >
> > [ ] +1 Release this package as Apache ORC 2.0.1
> > [ ] -1 Do not release this package because ...
> >
> > TAG:
> > https://github.com/apache/orc/releases/tag/v2.0.1-rc0
> >
> > RELEASE FILES:
> > https://dist.apache.org/repos/dist/dev/orc/v2.0.1-rc0
> >
> > STAGING REPOSITORY:
> > https://repository.apache.org/content/repositories/orgapacheorc-1080
> >
> > LIST OF ISSUES:
> > https://issues.apache.org/jira/projects/ORC/versions/12354405
> > https://github.com/apache/orc/milestone/29?closed=1
> >
> > This vote will be open for 72 hours.
> >
> > Regards,
> > William
> 


[jira] [Updated] (ORC-1714) Bump commons-csv to 1.11.0

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/ORC-1714?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated ORC-1714:
---
Fix Version/s: 2.0.1

> Bump commons-csv to 1.11.0
> --
>
> Key: ORC-1714
> URL: https://issues.apache.org/jira/browse/ORC-1714
> Project: ORC
>  Issue Type: Bug
>  Components: Java
>Affects Versions: 2.1.0
>Reporter: William Hyun
>Assignee: William Hyun
>Priority: Minor
> Fix For: 2.1.0, 2.0.1
>
>






[jira] [Updated] (ORC-1661) [C++] Better handling when TZDB is unavailable

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/ORC-1661?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated ORC-1661:
---
Fix Version/s: (was: 2.0.1)

> [C++] Better handling when TZDB is unavailable
> --
>
> Key: ORC-1661
> URL: https://issues.apache.org/jira/browse/ORC-1661
> Project: ORC
>  Issue Type: Improvement
>  Components: C++
>Reporter: Gang Wu
>Assignee: Gang Wu
>Priority: Major
> Fix For: 2.1.0
>
>
> When /usr/share/zoneinfo is unavailable and TZDIR env is unset, creating C++ 
> ORC reader will crash on Windows. We need to better deal with this case. See 
> context from the Apache Arrow community: 
> [https://github.com/apache/arrow/issues/36026] and 
> [https://github.com/apache/arrow/issues/40633]
>  
> We could perhaps do following things:
>  * Make sure it does not crash when TZDB is missing (on Windows). The 
> prerequisite is to enable running C++ unit tests on Windows.
>  * More TZDB search locations. (e.g. $CONDA_PREFIX/usr/share/zoneinfo)
>  * Do not eagerly read TZDB when timestamp types are not required.
>  * Add a runtime config to set TZDB location.
>  * Ship TZDB with the ORC library while building on Windows?





[jira] [Resolved] (SPARK-48144) canPlanAsBroadcastHashJoin should respect shuffle join hints

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48144?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48144.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46401
[https://github.com/apache/spark/pull/46401]

> canPlanAsBroadcastHashJoin should respect shuffle join hints
> 
>
> Key: SPARK-48144
> URL: https://issues.apache.org/jira/browse/SPARK-48144
> Project: Spark
>  Issue Type: Bug
>  Components: Optimizer
>Affects Versions: 4.0.0, 3.5.2, 3.4.4
>Reporter: Fredrik Klauß
>Assignee: Fredrik Klauß
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Currently, `canPlanAsBroadcastHashJoin` incorrectly reports that a join can 
> be planned as a BHJ, even though the join carries a shuffle hash join (SHJ) hint.
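The fix described above amounts to letting an explicit shuffle-style hint veto a broadcast plan. A minimal pure-Python sketch of that decision logic (the hint names follow Spark's documented join strategy hints; the function and its shape are illustrative, not Spark's actual implementation):

```python
# Join strategy hint names documented by Spark SQL.
BROADCAST_HINTS = {"BROADCAST", "BROADCASTJOIN", "MAPJOIN"}
SHUFFLE_HINTS = {"SHUFFLE_HASH", "MERGE", "SHUFFLE_MERGE", "MERGEJOIN",
                 "SHUFFLE_REPLICATE_NL"}

def can_plan_as_broadcast_hash_join(hints, small_enough):
    """Illustrative: a shuffle-style hint should veto a BHJ even when
    size estimates alone would permit broadcasting."""
    if any(h in SHUFFLE_HINTS for h in hints):
        return False
    if any(h in BROADCAST_HINTS for h in hints):
        return True
    # No hint: fall back to the size-based estimate.
    return small_enough
```

Before the fix, logic of this shape effectively ignored the first check, so a size-eligible join was reported as broadcastable even with a `SHUFFLE_HASH` hint present.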






[jira] [Assigned] (SPARK-48144) canPlanAsBroadcastHashJoin should respect shuffle join hints

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48144?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48144:
-

Assignee: Fredrik Klauß

> canPlanAsBroadcastHashJoin should respect shuffle join hints
> 
>
> Key: SPARK-48144
> URL: https://issues.apache.org/jira/browse/SPARK-48144
> Project: Spark
>  Issue Type: Bug
>  Components: Optimizer
>Affects Versions: 4.0.0, 3.5.2, 3.4.4
>Reporter: Fredrik Klauß
>Assignee: Fredrik Klauß
>Priority: Major
>  Labels: pull-request-available
>
> Currently, `canPlanAsBroadcastHashJoin` incorrectly reports that a join can 
> be planned as a BHJ, even though the join carries a shuffle hash join (SHJ) hint.






[jira] [Assigned] (SPARK-47441) Do not add log link for unmanaged AM in Spark UI

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47441?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-47441:
-

Assignee: Yuming Wang

> Do not add log link for unmanaged AM in Spark UI
> 
>
> Key: SPARK-47441
> URL: https://issues.apache.org/jira/browse/SPARK-47441
> Project: Spark
>  Issue Type: Bug
>  Components: YARN
>Affects Versions: 3.5.0, 3.5.1
>Reporter: Yuming Wang
>Assignee: Yuming Wang
>Priority: Major
>  Labels: pull-request-available
>
> {noformat}
> 24/03/18 04:58:25,022 ERROR [spark-listener-group-appStatus] 
> scheduler.AsyncEventQueue:97 : Listener AppStatusListener threw an exception
> java.lang.NumberFormatException: For input string: "null"
>   at 
> java.lang.NumberFormatException.forInputString(NumberFormatException.java:67) 
> ~[?:?]
>   at java.lang.Integer.parseInt(Integer.java:668) ~[?:?]
>   at java.lang.Integer.parseInt(Integer.java:786) ~[?:?]
>   at scala.collection.immutable.StringLike.toInt(StringLike.scala:310) 
> ~[scala-library-2.12.18.jar:?]
>   at scala.collection.immutable.StringLike.toInt$(StringLike.scala:310) 
> ~[scala-library-2.12.18.jar:?]
>   at scala.collection.immutable.StringOps.toInt(StringOps.scala:33) 
> ~[scala-library-2.12.18.jar:?]
>   at org.apache.spark.util.Utils$.parseHostPort(Utils.scala:1105) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.ProcessSummaryWrapper.<init>(storeTypes.scala:609) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.LiveMiscellaneousProcess.doUpdate(LiveEntity.scala:1045)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at org.apache.spark.status.LiveEntity.write(LiveEntity.scala:50) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.AppStatusListener.update(AppStatusListener.scala:1233)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.AppStatusListener.onMiscellaneousProcessAdded(AppStatusListener.scala:1445)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.AppStatusListener.onOtherEvent(AppStatusListener.scala:113)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.SparkListenerBus.doPostEvent(SparkListenerBus.scala:100)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.SparkListenerBus.doPostEvent$(SparkListenerBus.scala:28)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:117) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:101) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:105)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:105)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23) 
> ~[scala-library-2.12.18.jar:?]
>   at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) 
> ~[scala-library-2.12.18.jar:?]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1356) 
> [spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96)
>  [spark-core_2.12-3.5.1.jar:3.5.1]
> {noformat}
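The stack trace above fails inside `Utils.parseHostPort` because the unmanaged AM reports the literal string "null" as its port, which `Integer.parseInt` rejects. A defensive parse along these lines avoids the `NumberFormatException` (a Python sketch; `parse_host_port` is a hypothetical helper illustrating the failure mode, not Spark's code):

```python
def parse_host_port(host_port):
    """Split 'host:port' tolerantly: a missing or non-numeric port
    (e.g. the literal string "null") yields None instead of raising."""
    host, sep, port = host_port.rpartition(":")
    if sep and port.isdigit():
        return host, int(port)
    # No ':' or an unparsable port: return the input unchanged.
    return host_port, None
```

With this shape, the "null" port produced by an unmanaged AM would simply skip the integer conversion rather than crash the listener.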






[jira] [Resolved] (SPARK-47441) Do not add log link for unmanaged AM in Spark UI

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47441?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-47441.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45565
[https://github.com/apache/spark/pull/45565]

> Do not add log link for unmanaged AM in Spark UI
> 
>
> Key: SPARK-47441
> URL: https://issues.apache.org/jira/browse/SPARK-47441
> Project: Spark
>  Issue Type: Bug
>  Components: YARN
>Affects Versions: 3.5.0, 3.5.1
>Reporter: Yuming Wang
>Assignee: Yuming Wang
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> {noformat}
> 24/03/18 04:58:25,022 ERROR [spark-listener-group-appStatus] 
> scheduler.AsyncEventQueue:97 : Listener AppStatusListener threw an exception
> java.lang.NumberFormatException: For input string: "null"
>   at 
> java.lang.NumberFormatException.forInputString(NumberFormatException.java:67) 
> ~[?:?]
>   at java.lang.Integer.parseInt(Integer.java:668) ~[?:?]
>   at java.lang.Integer.parseInt(Integer.java:786) ~[?:?]
>   at scala.collection.immutable.StringLike.toInt(StringLike.scala:310) 
> ~[scala-library-2.12.18.jar:?]
>   at scala.collection.immutable.StringLike.toInt$(StringLike.scala:310) 
> ~[scala-library-2.12.18.jar:?]
>   at scala.collection.immutable.StringOps.toInt(StringOps.scala:33) 
> ~[scala-library-2.12.18.jar:?]
>   at org.apache.spark.util.Utils$.parseHostPort(Utils.scala:1105) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.ProcessSummaryWrapper.<init>(storeTypes.scala:609) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.LiveMiscellaneousProcess.doUpdate(LiveEntity.scala:1045)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at org.apache.spark.status.LiveEntity.write(LiveEntity.scala:50) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.AppStatusListener.update(AppStatusListener.scala:1233)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.AppStatusListener.onMiscellaneousProcessAdded(AppStatusListener.scala:1445)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.status.AppStatusListener.onOtherEvent(AppStatusListener.scala:113)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.SparkListenerBus.doPostEvent(SparkListenerBus.scala:100)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.SparkListenerBus.doPostEvent$(SparkListenerBus.scala:28)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:117) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:101) 
> ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:105)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:105)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23) 
> ~[scala-library-2.12.18.jar:?]
>   at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) 
> ~[scala-library-2.12.18.jar:?]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96)
>  ~[spark-core_2.12-3.5.1.jar:3.5.1]
>   at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1356) 
> [spark-core_2.12-3.5.1.jar:3.5.1]
>   at 
> org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96)
>  [spark-core_2.12-3.5.1.jar:3.5.1]
> {noformat}






[jira] [Assigned] (SPARK-48235) Directly pass join instead of all arguments to getBroadcastBuildSide and getShuffleHashJoinBuildSide

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48235?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48235:
-

Assignee: Fredrik Klauß  (was: Dongjoon Hyun)

> Directly pass join instead of all arguments to getBroadcastBuildSide and 
> getShuffleHashJoinBuildSide
> 
>
> Key: SPARK-48235
> URL: https://issues.apache.org/jira/browse/SPARK-48235
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>Assignee: Fredrik Klauß
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Updated] (SPARK-48235) Directly pass join instead of all arguments to getBroadcastBuildSide and getShuffleHashJoinBuildSide

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48235?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48235:
--
Reporter: Fredrik Klauß  (was: Dongjoon Hyun)

> Directly pass join instead of all arguments to getBroadcastBuildSide and 
> getShuffleHashJoinBuildSide
> 
>
> Key: SPARK-48235
> URL: https://issues.apache.org/jira/browse/SPARK-48235
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Fredrik Klauß
>Assignee: Fredrik Klauß
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-48235) Directly pass join instead of all arguments to getBroadcastBuildSide and getShuffleHashJoinBuildSide

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48235?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48235.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46525
[https://github.com/apache/spark/pull/46525]

> Directly pass join instead of all arguments to getBroadcastBuildSide and 
> getShuffleHashJoinBuildSide
> 
>
> Key: SPARK-48235
> URL: https://issues.apache.org/jira/browse/SPARK-48235
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48235) Directly pass join instead of all arguments to getBroadcastBuildSide and getShuffleHashJoinBuildSide

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48235?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48235:
-

Assignee: Dongjoon Hyun

> Directly pass join instead of all arguments to getBroadcastBuildSide and 
> getShuffleHashJoinBuildSide
> 
>
> Key: SPARK-48235
> URL: https://issues.apache.org/jira/browse/SPARK-48235
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-48235) Directly pass join instead of all arguments to getBroadcastBuildSide and getShuffleHashJoinBuildSide

2024-05-10 Thread Dongjoon Hyun (Jira)
Dongjoon Hyun created SPARK-48235:
-

 Summary: Directly pass join instead of all arguments to 
getBroadcastBuildSide and getShuffleHashJoinBuildSide
 Key: SPARK-48235
 URL: https://issues.apache.org/jira/browse/SPARK-48235
 Project: Spark
  Issue Type: Improvement
  Components: SQL
Affects Versions: 4.0.0
Reporter: Dongjoon Hyun
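The refactoring named in the summary is the common "pass the object, not its fields" cleanup: rather than threading every join attribute through `getBroadcastBuildSide` and `getShuffleHashJoinBuildSide`, the join itself is passed. A hedged before/after sketch (names and fields are illustrative, not Spark's actual signatures):

```python
from dataclasses import dataclass

@dataclass
class Join:
    join_type: str
    left_size: int
    right_size: int
    hint: str = ""

# Before: every attribute is unpacked at each call site.
def get_build_side_v1(join_type, left_size, right_size, hint):
    return "left" if left_size <= right_size else "right"

# After: pass the join itself; newly needed attributes (e.g. hints)
# become available without touching every signature.
def get_build_side(join):
    return "left" if join.left_size <= join.right_size else "right"
```

The payoff is exactly the SPARK-48144 follow-up above: once the whole join is in scope, hint-aware checks can be added without another round of signature churn.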









[jira] [Resolved] (SPARK-48230) Remove unused jodd-core

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48230.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46520
[https://github.com/apache/spark/pull/46520]

> Remove unused jodd-core
> ---
>
> Key: SPARK-48230
> URL: https://issues.apache.org/jira/browse/SPARK-48230
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48231) Remove unused CodeHaus Jackson dependencies

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48231?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48231:
-

Assignee: Cheng Pan

> Remove unused CodeHaus Jackson dependencies
> ---
>
> Key: SPARK-48231
> URL: https://issues.apache.org/jira/browse/SPARK-48231
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Assigned] (SPARK-48230) Remove unused jodd-core

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48230:
-

Assignee: Cheng Pan

> Remove unused jodd-core
> ---
>
> Key: SPARK-48230
> URL: https://issues.apache.org/jira/browse/SPARK-48230
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Updated] (SPARK-48231) Remove unused CodeHaus Jackson dependencies

2024-05-10 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48231?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48231:
--
Parent: SPARK-47046
Issue Type: Sub-task  (was: Improvement)

> Remove unused CodeHaus Jackson dependencies
> ---
>
> Key: SPARK-48231
> URL: https://issues.apache.org/jira/browse/SPARK-48231
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-47847) Deprecate spark.network.remoteReadNioBufferConversion

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47847?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-47847.
---
Fix Version/s: 3.5.2
   4.0.0
   Resolution: Fixed

Issue resolved by pull request 46047
[https://github.com/apache/spark/pull/46047]

> Deprecate spark.network.remoteReadNioBufferConversion
> -
>
> Key: SPARK-47847
> URL: https://issues.apache.org/jira/browse/SPARK-47847
> Project: Spark
>  Issue Type: Improvement
>  Components: Shuffle, Spark Core
>Affects Versions: 3.5.2
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.5.2, 4.0.0
>
>







[jira] [Assigned] (SPARK-47847) Deprecate spark.network.remoteReadNioBufferConversion

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47847?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-47847:
-

Assignee: Cheng Pan

> Deprecate spark.network.remoteReadNioBufferConversion
> -
>
> Key: SPARK-47847
> URL: https://issues.apache.org/jira/browse/SPARK-47847
> Project: Spark
>  Issue Type: Improvement
>  Components: Shuffle, Spark Core
>Affects Versions: 3.5.2
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Updated] (SPARK-47847) Deprecate spark.network.remoteReadNioBufferConversion

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47847?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-47847:
--
Parent: SPARK-44111
Issue Type: Sub-task  (was: Improvement)

> Deprecate spark.network.remoteReadNioBufferConversion
> -
>
> Key: SPARK-47847
> URL: https://issues.apache.org/jira/browse/SPARK-47847
> Project: Spark
>  Issue Type: Sub-task
>  Components: Shuffle, Spark Core
>Affects Versions: 3.5.2
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2
>
>







[jira] [Updated] (SPARK-48230) Remove unused jodd-core

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48230:
--
Parent: SPARK-47046
Issue Type: Sub-task  (was: Improvement)

> Remove unused jodd-core
> ---
>
> Key: SPARK-48230
> URL: https://issues.apache.org/jira/browse/SPARK-48230
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Updated] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48094:
--
Description: 
h2. ASF INFRA POLICY
 - [https://infra.apache.org/github-actions-policy.html]

h2. MONITORING
 - [https://infra-reports.apache.org/#ghactions=spark=168]

!Screenshot 2024-05-02 at 23.56.05.png|width=100%!

h2. TARGET
 * All workflows MUST have a job concurrency level less than or equal to 20. 
This means a workflow cannot have more than 20 jobs running at the same time 
across all matrices.
 * All workflows SHOULD have a job concurrency level less than or equal to 15. 
Just because 20 is the max, doesn't mean you should strive for 20.
 * The average number of minutes a project uses per calendar week MUST NOT 
exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 hours).
 * The average number of minutes a project uses in any consecutive five-day 
period MUST NOT exceed the equivalent of 30 full-time runners (216,000 minutes, 
or 3,600 hours).

h2. DEADLINE
{quote}17th of May, 2024
{quote}

  was:
h2. ASF INFRA POLICY
 - [https://infra.apache.org/github-actions-policy.html]

h2. MONITORING
 - [https://infra-reports.apache.org/#ghactions=spark=168]

!Screenshot 2024-05-02 at 23.56.05.png|width=100!
h2. TARGET
 * All workflows MUST have a job concurrency level less than or equal to 20. 
This means a workflow cannot have more than 20 jobs running at the same time 
across all matrices.
 * All workflows SHOULD have a job concurrency level less than or equal to 15. 
Just because 20 is the max, doesn't mean you should strive for 20.
 * The average number of minutes a project uses per calendar week MUST NOT 
exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 hours).
 * The average number of minutes a project uses in any consecutive five-day 
period MUST NOT exceed the equivalent of 30 full-time runners (216,000 minutes, 
or 3,600 hours).

h2. DEADLINE
{quote}17th of May, 2024
{quote}


> Reduce GitHub Action usage according to ASF project allowance
> -
>
> Key: SPARK-48094
> URL: https://issues.apache.org/jira/browse/SPARK-48094
> Project: Spark
>  Issue Type: Umbrella
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>Priority: Major
> Attachments: Screenshot 2024-05-02 at 23.56.05.png
>
>
> h2. ASF INFRA POLICY
>  - [https://infra.apache.org/github-actions-policy.html]
> h2. MONITORING
>  - [https://infra-reports.apache.org/#ghactions=spark=168]
> !Screenshot 2024-05-02 at 23.56.05.png|width=100%!
> h2. TARGET
>  * All workflows MUST have a job concurrency level less than or equal to 20. 
> This means a workflow cannot have more than 20 jobs running at the same time 
> across all matrices.
>  * All workflows SHOULD have a job concurrency level less than or equal to 
> 15. Just because 20 is the max, doesn't mean you should strive for 20.
>  * The average number of minutes a project uses per calendar week MUST NOT 
> exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 
> hours).
>  * The average number of minutes a project uses in any consecutive five-day 
> period MUST NOT exceed the equivalent of 30 full-time runners (216,000 
> minutes, or 3,600 hours).
> h2. DEADLINE
> {quote}17th of May, 2024
> {quote}
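The MUST/SHOULD concurrency targets quoted above map directly onto GitHub Actions configuration. A minimal workflow sketch that stays within them, assuming the workflow name, concurrency group, and matrix modules are hypothetical:

```yaml
name: build
on: [push]

# Cancel superseded runs so stale queued jobs do not burn minutes.
concurrency:
  group: build-${{ github.ref }}
  cancel-in-progress: true

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      # Keep the whole matrix at or below the recommended 15
      # (hard cap 20) parallel jobs.
      max-parallel: 15
      matrix:
        module: [core, sql, streaming]
    steps:
      - uses: actions/checkout@v4
```

`strategy.max-parallel` bounds jobs within one matrix; the `concurrency` group limits duplicate runs per branch, which is what keeps the weekly minute budget under the 25-runner equivalent.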






[jira] [Resolved] (SPARK-48201) Docstrings of the pyspark DataStream Reader methods are inaccurate

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48201?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48201.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46416
[https://github.com/apache/spark/pull/46416]

> Docstrings of the pyspark DataStream Reader methods are inaccurate
> --
>
> Key: SPARK-48201
> URL: https://issues.apache.org/jira/browse/SPARK-48201
> Project: Spark
>  Issue Type: Documentation
>  Components: PySpark
>Affects Versions: 3.4.3
>Reporter: Chloe He
>Assignee: Chloe He
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> The docstrings of the pyspark DataStream Reader methods {{csv()}} and 
> {{text()}} say that the {{path}} parameter can be a list, but an error is 
> actually raised when a list is passed.
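The mismatch is between the docstring (which advertised list support) and the runtime behavior (a single directory path). A small guard in the spirit of the corrected docs (pure-Python sketch; `validate_stream_path` is a hypothetical helper, not PySpark API):

```python
def validate_stream_path(path):
    """Streaming readers take one directory path as a str, not a list,
    matching the behavior the corrected docstrings describe."""
    if not isinstance(path, str):
        raise TypeError(
            f"path must be a str, got {type(path).__name__}")
    return path
```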






[jira] [Assigned] (SPARK-48201) Docstrings of the pyspark DataStream Reader methods are inaccurate

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48201?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48201:
-

Assignee: Chloe He

> Docstrings of the pyspark DataStream Reader methods are inaccurate
> --
>
> Key: SPARK-48201
> URL: https://issues.apache.org/jira/browse/SPARK-48201
> Project: Spark
>  Issue Type: Documentation
>  Components: PySpark
>Affects Versions: 3.4.3
>Reporter: Chloe He
>Assignee: Chloe He
>Priority: Minor
>  Labels: pull-request-available
>
> The docstrings of the pyspark DataStream Reader methods {{csv()}} and 
> {{text()}} say that the {{path}} parameter can be a list, but an error is 
> actually raised when a list is passed.






[jira] [Resolved] (SPARK-48228) Implement the missing function validation in ApplyInXXX

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48228?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48228.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46519
[https://github.com/apache/spark/pull/46519]

> Implement the missing function validation in ApplyInXXX
> ---
>
> Key: SPARK-48228
> URL: https://issues.apache.org/jira/browse/SPARK-48228
> Project: Spark
>  Issue Type: Sub-task
>  Components: Connect, PySpark
>Affects Versions: 4.0.0
>Reporter: Ruifeng Zheng
>Assignee: Ruifeng Zheng
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48228) Implement the missing function validation in ApplyInXXX

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48228?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48228:
-

Assignee: Ruifeng Zheng

> Implement the missing function validation in ApplyInXXX
> ---
>
> Key: SPARK-48228
> URL: https://issues.apache.org/jira/browse/SPARK-48228
> Project: Spark
>  Issue Type: Sub-task
>  Components: Connect, PySpark
>Affects Versions: 4.0.0
>Reporter: Ruifeng Zheng
>Assignee: Ruifeng Zheng
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-48224) Disable variant from being a part of a map key

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48224?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48224.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46516
[https://github.com/apache/spark/pull/46516]

> Disable variant from being a part of a map key
> --
>
> Key: SPARK-48224
> URL: https://issues.apache.org/jira/browse/SPARK-48224
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Harsh Motwani
>Assignee: Harsh Motwani
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Creating a map object with a variant key currently works. However, this 
> behavior should be disabled.






[jira] [Assigned] (SPARK-48224) Disable variant from being a part of a map key

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48224?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48224:
-

Assignee: Harsh Motwani

> Disable variant from being a part of a map key
> --
>
> Key: SPARK-48224
> URL: https://issues.apache.org/jira/browse/SPARK-48224
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Harsh Motwani
>Assignee: Harsh Motwani
>Priority: Major
>  Labels: pull-request-available
>
> Creating a map object with a variant key currently works. However, this 
> behavior should be disabled.






[jira] [Updated] (SPARK-48163) Disable `SparkConnectServiceSuite.SPARK-43923: commands send events - get_resources_command`

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48163?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48163:
--
Parent: (was: SPARK-44111)
Issue Type: Bug  (was: Sub-task)

> Disable `SparkConnectServiceSuite.SPARK-43923: commands send events - 
> get_resources_command`
> 
>
> Key: SPARK-48163
> URL: https://issues.apache.org/jira/browse/SPARK-48163
> Project: Spark
>  Issue Type: Bug
>  Components: SQL, Tests
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
>
> {code}
> - SPARK-43923: commands send events ((get_resources_command {
> [info] }
> [info] ,None)) *** FAILED *** (35 milliseconds)
> [info]   VerifyEvents.this.listener.executeHolder.isDefined was false 
> (SparkConnectServiceSuite.scala:873)
> {code}






[jira] [Closed] (SPARK-37626) Upgrade libthrift to 0.15.0

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-37626?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-37626.
-

> Upgrade libthrift to 0.15.0
> ---
>
> Key: SPARK-37626
> URL: https://issues.apache.org/jira/browse/SPARK-37626
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 3.3.0
>Reporter: Bo Zhang
>Priority: Major
>
> Upgrade libthrift to 1.15.0 in order to avoid 
> https://nvd.nist.gov/vuln/detail/CVE-2020-13949.






[jira] [Resolved] (SPARK-47018) Upgrade built-in Hive to 2.3.10

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47018?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-47018.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46468
[https://github.com/apache/spark/pull/46468]

> Upgrade built-in Hive to 2.3.10
> ---
>
> Key: SPARK-47018
> URL: https://issues.apache.org/jira/browse/SPARK-47018
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build, SQL
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>    Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Updated] (SPARK-47834) Mark deprecated functions with `@deprecated` in `SQLImplicits`

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47834?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-47834:
--
Parent: SPARK-44111
Issue Type: Sub-task  (was: Improvement)

> Mark deprecated functions with `@deprecated` in `SQLImplicits`
> --
>
> Key: SPARK-47834
> URL: https://issues.apache.org/jira/browse/SPARK-47834
> Project: Spark
>  Issue Type: Sub-task
>  Components: Connect, SQL
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-47834) Mark deprecated functions with `@deprecated` in `SQLImplicits`

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47834?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-47834.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46029
[https://github.com/apache/spark/pull/46029]

> Mark deprecated functions with `@deprecated` in `SQLImplicits`
> --
>
> Key: SPARK-47834
> URL: https://issues.apache.org/jira/browse/SPARK-47834
> Project: Spark
>  Issue Type: Improvement
>  Components: Connect, SQL
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-48227) Document the requirement of seed in protos

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48227?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48227.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46518
[https://github.com/apache/spark/pull/46518]

> Document the requirement of seed in protos
> --
>
> Key: SPARK-48227
> URL: https://issues.apache.org/jira/browse/SPARK-48227
> Project: Spark
>  Issue Type: Improvement
>  Components: Documentation, PySpark
>Affects Versions: 4.0.0
>Reporter: Ruifeng Zheng
>Assignee: Ruifeng Zheng
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48227) Document the requirement of seed in protos

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48227?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48227:
-

Assignee: Ruifeng Zheng

> Document the requirement of seed in protos
> --
>
> Key: SPARK-48227
> URL: https://issues.apache.org/jira/browse/SPARK-48227
> Project: Spark
>  Issue Type: Improvement
>  Components: Documentation, PySpark
>Affects Versions: 4.0.0
>Reporter: Ruifeng Zheng
>Assignee: Ruifeng Zheng
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-48226) Add `spark-ganglia-lgpl` to `lint-java` & `spark-ganglia-lgpl` and `jvm-profiler` to `sbt-checkstyle`

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48226?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48226.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46501
[https://github.com/apache/spark/pull/46501]

> Add `spark-ganglia-lgpl` to `lint-java` & `spark-ganglia-lgpl` and 
> `jvm-profiler` to `sbt-checkstyle`
> -
>
> Key: SPARK-48226
> URL: https://issues.apache.org/jira/browse/SPARK-48226
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48226) Add `spark-ganglia-lgpl` to `lint-java` & `spark-ganglia-lgpl` and `jvm-profiler` to `sbt-checkstyle`

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48226?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48226:
-

Assignee: BingKun Pan

> Add `spark-ganglia-lgpl` to `lint-java` & `spark-ganglia-lgpl` and 
> `jvm-profiler` to `sbt-checkstyle`
> -
>
> Key: SPARK-48226
> URL: https://issues.apache.org/jira/browse/SPARK-48226
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-48225) Upgrade `sbt` to 1.10.0

2024-05-09 Thread Dongjoon Hyun (Jira)
Dongjoon Hyun created SPARK-48225:
-

 Summary: Upgrade `sbt` to 1.10.0
 Key: SPARK-48225
 URL: https://issues.apache.org/jira/browse/SPARK-48225
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Dongjoon Hyun









[jira] [Closed] (SPARK-48164) Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - get_resources_command`

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48164?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-48164.
-

> Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - 
> get_resources_command`
> --
>
> Key: SPARK-48164
> URL: https://issues.apache.org/jira/browse/SPARK-48164
> Project: Spark
>  Issue Type: Bug
>  Components: Connect, Tests
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>Priority: Major
>







[jira] [Updated] (SPARK-48164) Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - get_resources_command`

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48164?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48164:
--
Parent: (was: SPARK-44111)
Issue Type: Bug  (was: Sub-task)

> Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - 
> get_resources_command`
> --
>
> Key: SPARK-48164
> URL: https://issues.apache.org/jira/browse/SPARK-48164
> Project: Spark
>  Issue Type: Bug
>  Components: Connect, Tests
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>Priority: Major
>







[jira] [Updated] (SPARK-48164) Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - get_resources_command`

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48164?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48164:
--
Priority: Major  (was: Blocker)

> Re-enable `SparkConnectServiceSuite.SPARK-43923: commands send events - 
> get_resources_command`
> --
>
> Key: SPARK-48164
> URL: https://issues.apache.org/jira/browse/SPARK-48164
> Project: Spark
>  Issue Type: Sub-task
>  Components: Connect, Tests
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>Priority: Major
>







[jira] [Updated] (SPARK-47930) Upgrade RoaringBitmap to 1.0.6

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47930?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-47930:
--
Parent: SPARK-47046
Issue Type: Sub-task  (was: Improvement)

> Upgrade RoaringBitmap to 1.0.6
> --
>
> Key: SPARK-47930
> URL: https://issues.apache.org/jira/browse/SPARK-47930
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Updated] (SPARK-47982) Update code style plugins to latest version

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47982?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-47982:
--
Parent: SPARK-47046
Issue Type: Sub-task  (was: Improvement)

> Update code style plugins to latest version
> 
>
> Key: SPARK-47982
> URL: https://issues.apache.org/jira/browse/SPARK-47982
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







Re: [DISCUSS] Spark 4.0.0 release

2024-05-09 Thread Dongjoon Hyun
Please retry the upload, Wenchen. The ASF Infra team bumped up our upload limit
based on our request.

> Your upload limit has been increased to 650MB

Dongjoon.



On Thu, May 9, 2024 at 8:12 AM Wenchen Fan  wrote:

> I've created a ticket: https://issues.apache.org/jira/browse/INFRA-25776
>
> On Thu, May 9, 2024 at 11:06 PM Dongjoon Hyun 
> wrote:
>
>> In addition, FYI, I was the latest release manager with Apache Spark
>> 3.4.3 (2024-04-15 Vote)
>>
>> According to my work log, I uploaded the following binaries to SVN from
>> EC2 (us-west-2) without any issues.
>>
>> -rw-r--r--.  1 centos centos 311384003 Apr 15 01:29 pyspark-3.4.3.tar.gz
>> -rw-r--r--.  1 centos centos 397870995 Apr 15 00:44
>> spark-3.4.3-bin-hadoop3-scala2.13.tgz
>> -rw-r--r--.  1 centos centos 388930980 Apr 15 01:29
>> spark-3.4.3-bin-hadoop3.tgz
>> -rw-r--r--.  1 centos centos 300786123 Apr 15 01:04
>> spark-3.4.3-bin-without-hadoop.tgz
>> -rw-r--r--.  1 centos centos  32219044 Apr 15 00:23 spark-3.4.3.tgz
>> -rw-r--r--.  1 centos centos356749 Apr 15 01:29 SparkR_3.4.3.tar.gz
>>
>> Since Apache Spark 4.0.0-preview doesn't have a Scala 2.12 combination, the
>> total size should be smaller than the 3.4.3 binaries.
>>
>> Given that, if there was any INFRA change, it could only have happened after 4/15.
>>
>> Dongjoon.
>>
>> On Thu, May 9, 2024 at 7:57 AM Dongjoon Hyun 
>> wrote:
>>
>>> Could you file an INFRA JIRA issue with the error message and context
>>> first, Wenchen?
>>>
>>> As you know, if we see something, we had better file a JIRA issue
>>> because it could be not only an Apache Spark project issue but also all ASF
>>> project issues.
>>>
>>> Dongjoon.
>>>
>>>
>>> On Thu, May 9, 2024 at 12:28 AM Wenchen Fan  wrote:
>>>
>>>> UPDATE:
>>>>
>>>> After resolving a few issues in the release scripts, I can finally
>>>> build the release packages. However, I can't upload them to the staging SVN
>>>> repo due to a transmission error, and it seems like a limitation from the
>>>> server side. I tried it on both my local laptop and remote AWS instance,
>>>> but neither works. These package binaries are like 300-400 MBs, and we just
>>>> did a release last month. Not sure if this is a new limitation due to cost
>>>> saving.
>>>>
>>>> While I'm looking for help to get unblocked, I'm wondering if we can
>>>> upload release packages to a public git repo instead, under the Apache
>>>> account?
>>>>


[jira] [Resolved] (SPARK-48216) Remove overrides DockerJDBCIntegrationSuite.connectionTimeout to make related tests configurable

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48216?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48216.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46505
[https://github.com/apache/spark/pull/46505]

> Remove overrides DockerJDBCIntegrationSuite.connectionTimeout to make related 
> tests configurable
> 
>
> Key: SPARK-48216
> URL: https://issues.apache.org/jira/browse/SPARK-48216
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Docker, Tests
>Affects Versions: 4.0.0
>Reporter: Kent Yao
>Assignee: Kent Yao
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48216) Remove overrides DockerJDBCIntegrationSuite.connectionTimeout to make related tests configurable

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48216?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48216:
-

Assignee: Kent Yao

> Remove overrides DockerJDBCIntegrationSuite.connectionTimeout to make related 
> tests configurable
> 
>
> Key: SPARK-48216
> URL: https://issues.apache.org/jira/browse/SPARK-48216
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Docker, Tests
>Affects Versions: 4.0.0
>Reporter: Kent Yao
>Assignee: Kent Yao
>Priority: Major
>  Labels: pull-request-available
>







Re: [DISCUSS] Spark 4.0.0 release

2024-05-09 Thread Dongjoon Hyun
In addition, FYI, I was the latest release manager with Apache Spark 3.4.3
(2024-04-15 Vote)

According to my work log, I uploaded the following binaries to SVN from EC2
(us-west-2) without any issues.

-rw-r--r--.  1 centos centos 311384003 Apr 15 01:29 pyspark-3.4.3.tar.gz
-rw-r--r--.  1 centos centos 397870995 Apr 15 00:44
spark-3.4.3-bin-hadoop3-scala2.13.tgz
-rw-r--r--.  1 centos centos 388930980 Apr 15 01:29
spark-3.4.3-bin-hadoop3.tgz
-rw-r--r--.  1 centos centos 300786123 Apr 15 01:04
spark-3.4.3-bin-without-hadoop.tgz
-rw-r--r--.  1 centos centos  32219044 Apr 15 00:23 spark-3.4.3.tgz
-rw-r--r--.  1 centos centos356749 Apr 15 01:29 SparkR_3.4.3.tar.gz

Since Apache Spark 4.0.0-preview doesn't have a Scala 2.12 combination, the
total size should be smaller than the 3.4.3 binaries.

Given that, if there was any INFRA change, it could only have happened after 4/15.

Dongjoon.

On Thu, May 9, 2024 at 7:57 AM Dongjoon Hyun 
wrote:

> Could you file an INFRA JIRA issue with the error message and context
> first, Wenchen?
>
> As you know, if we see something, we had better file a JIRA issue because
> it could be not only an Apache Spark project issue but also all ASF project
> issues.
>
> Dongjoon.
>
>
> On Thu, May 9, 2024 at 12:28 AM Wenchen Fan  wrote:
>
>> UPDATE:
>>
>> After resolving a few issues in the release scripts, I can finally build
>> the release packages. However, I can't upload them to the staging SVN repo
>> due to a transmission error, and it seems like a limitation from the server
>> side. I tried it on both my local laptop and remote AWS instance, but
>> neither works. These package binaries are like 300-400 MBs, and we just did
>> a release last month. Not sure if this is a new limitation due to cost
>> saving.
>>
>> While I'm looking for help to get unblocked, I'm wondering if we can
>> upload release packages to a public git repo instead, under the Apache
>> account?
>>


Re: [DISCUSS] Spark 4.0.0 release

2024-05-09 Thread Dongjoon Hyun
>>>>>>> safe
>>>>>>> (there was some concern from earlier release processes).
>>>>>>>
>>>>>>> Twitter: https://twitter.com/holdenkarau
>>>>>>> Books (Learning Spark, High Performance Spark, etc.):
>>>>>>> https://amzn.to/2MaRAG9
>>>>>>> YouTube Live Streams: https://www.youtube.com/user/holdenkarau
>>>>>>>
>>>>>>>
>>>>>>> On Tue, May 7, 2024 at 10:55 AM Nimrod Ofek 
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Hi,
>>>>>>>>
>>>>>>>> Sorry for the novice question, Wenchen - the release is done
>>>>>>>> manually from a laptop? Not using a CI CD process on a build server?
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> Nimrod
>>>>>>>>
>>>>>>>> On Tue, May 7, 2024 at 8:50 PM Wenchen Fan 
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>> UPDATE:
>>>>>>>>>
>>>>>>>>> Unfortunately, it took me quite some time to set up my laptop and
>>>>>>>>> get it ready for the release process (docker desktop doesn't work 
>>>>>>>>> anymore,
>>>>>>>>> my pgp key is lost, etc.). I'll start the RC process at my tomorrow. 
>>>>>>>>> Thanks
>>>>>>>>> for your patience!
>>>>>>>>>
>>>>>>>>> Wenchen
>>>>>>>>>
>>>>>>>>> On Fri, May 3, 2024 at 7:47 AM yangjie01 
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>>> +1
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> *From**: *Jungtaek Lim 
>>>>>>>>>> *Date**: *Thursday, May 2, 2024, 10:21
>>>>>>>>>> *To**: *Holden Karau 
>>>>>>>>>> *Cc**: *Chao Sun , Xiao Li <
>>>>>>>>>> gatorsm...@gmail.com>, Tathagata Das ,
>>>>>>>>>> Wenchen Fan , Cheng Pan ,
>>>>>>>>>> Nicholas Chammas , Dongjoon Hyun <
>>>>>>>>>> dongjoon.h...@gmail.com>, Cheng Pan , Spark
>>>>>>>>>> dev list , Anish Shrigondekar <
>>>>>>>>>> anish.shrigonde...@databricks.com>
>>>>>>>>>> *Subject**: *Re: [DISCUSS] Spark 4.0.0 release
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> +1 love to see it!
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Thu, May 2, 2024 at 10:08 AM Holden Karau <
>>>>>>>>>> holden.ka...@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> +1 :) yay previews
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Wed, May 1, 2024 at 5:36 PM Chao Sun 
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>> +1
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Wed, May 1, 2024 at 5:23 PM Xiao Li 
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>> +1 for next Monday.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> We can do more previews when the other features are ready for
>>>>>>>>>> preview.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Tathagata Das  于2024年5月1日周三 08:46写道:
>>>>>>>>>>
>>>>>>>>>> Next week sounds great! Thank you Wenchen!
>>>>>>>>>>
>>>>>>>>>>

[jira] [Updated] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48094:
--
Description: 
h2. ASF INFRA POLICY
 - [https://infra.apache.org/github-actions-policy.html]

h2. MONITORING
 - [https://infra-reports.apache.org/#ghactions=spark=168]

!Screenshot 2024-05-02 at 23.56.05.png|width=100!
h2. TARGET
 * All workflows MUST have a job concurrency level less than or equal to 20. 
This means a workflow cannot have more than 20 jobs running at the same time 
across all matrices.
 * All workflows SHOULD have a job concurrency level less than or equal to 15. 
Just because 20 is the max, doesn't mean you should strive for 20.
 * The average number of minutes a project uses per calendar week MUST NOT 
exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 hours).
 * The average number of minutes a project uses in any consecutive five-day 
period MUST NOT exceed the equivalent of 30 full-time runners (216,000 minutes, 
or 3,600 hours).

h2. DEADLINE
{quote}17th of May, 2024
{quote}

  was:
h2. ASF INFRA POLICY
- https://infra.apache.org/github-actions-policy.html

h2. MONITORING
- https://infra-reports.apache.org/#ghactions=spark=168

 !Screenshot 2024-05-02 at 23.56.05.png|width=100%! 

h2. TARGET
* All workflows MUST have a job concurrency level less than or equal to 20. 
This means a workflow cannot have more than 20 jobs running at the same time 
across all matrices.
* All workflows SHOULD have a job concurrency level less than or equal to 15. 
Just because 20 is the max, doesn't mean you should strive for 20.
* The average number of minutes a project uses per calendar week MUST NOT 
exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 hours).
* The average number of minutes a project uses in any consecutive five-day 
period MUST NOT exceed the equivalent of 30 full-time runners (216,000 minutes, 
or 3,600 hours).

h2. DEADLINE
bq. 17th of May, 2024

Since the deadline is 17th of May, 2024, I set this as the highest priority, 
`Blocker`.




> Reduce GitHub Action usage according to ASF project allowance
> -
>
> Key: SPARK-48094
> URL: https://issues.apache.org/jira/browse/SPARK-48094
> Project: Spark
>  Issue Type: Umbrella
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>Priority: Major
> Attachments: Screenshot 2024-05-02 at 23.56.05.png
>
>
> h2. ASF INFRA POLICY
>  - [https://infra.apache.org/github-actions-policy.html]
> h2. MONITORING
>  - [https://infra-reports.apache.org/#ghactions=spark=168]
> !Screenshot 2024-05-02 at 23.56.05.png|width=100!
> h2. TARGET
>  * All workflows MUST have a job concurrency level less than or equal to 20. 
> This means a workflow cannot have more than 20 jobs running at the same time 
> across all matrices.
>  * All workflows SHOULD have a job concurrency level less than or equal to 
> 15. Just because 20 is the max, doesn't mean you should strive for 20.
>  * The average number of minutes a project uses per calendar week MUST NOT 
> exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 
> hours).
>  * The average number of minutes a project uses in any consecutive five-day 
> period MUST NOT exceed the equivalent of 30 full-time runners (216,000 
> minutes, or 3,600 hours).
> h2. DEADLINE
> {quote}17th of May, 2024
> {quote}
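The first two TARGET bullets above are normally met with a per-workflow `concurrency` group plus a bounded build matrix. As a sketch only (this is not Spark's actual configuration — the workflow, job, module, and script names below are illustrative), a GitHub Actions file can bound concurrency like this:

```yaml
name: build
on: [push, pull_request]

# One in-flight run per branch: a newer push cancels the older run,
# which caps both concurrent jobs and total runner minutes.
concurrency:
  group: build-${{ github.ref }}
  cancel-in-progress: true

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      # Keep simultaneous matrix jobs at or under the SHOULD limit of 15.
      max-parallel: 15
      matrix:
        module: [core, sql, streaming]
    steps:
      - uses: actions/checkout@v4
      - run: ./dev/run-tests --module ${{ matrix.module }}
```

The runner-minute bounds then follow from how many such workflows run per week across all triggers, which is what the infra-reports dashboard linked above monitors.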






[jira] [Updated] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48094:
--
Priority: Major  (was: Blocker)

> Reduce GitHub Action usage according to ASF project allowance
> -
>
> Key: SPARK-48094
> URL: https://issues.apache.org/jira/browse/SPARK-48094
> Project: Spark
>  Issue Type: Umbrella
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>Priority: Major
> Attachments: Screenshot 2024-05-02 at 23.56.05.png
>
>
> h2. ASF INFRA POLICY
> - https://infra.apache.org/github-actions-policy.html
> h2. MONITORING
> - https://infra-reports.apache.org/#ghactions=spark=168
>  !Screenshot 2024-05-02 at 23.56.05.png|width=100%! 
> h2. TARGET
> * All workflows MUST have a job concurrency level less than or equal to 20. 
> This means a workflow cannot have more than 20 jobs running at the same time 
> across all matrices.
> * All workflows SHOULD have a job concurrency level less than or equal to 15. 
> Just because 20 is the max, doesn't mean you should strive for 20.
> * The average number of minutes a project uses per calendar week MUST NOT 
> exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 
> hours).
> * The average number of minutes a project uses in any consecutive five-day 
> period MUST NOT exceed the equivalent of 30 full-time runners (216,000 
> minutes, or 3,600 hours).
> h2. DEADLINE
> bq. 17th of May, 2024
> Since the deadline is 17th of May, 2024, I set this as the highest priority, 
> `Blocker`.






[jira] [Updated] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48094:
--
Fix Version/s: (was: 4.0.0)

> Reduce GitHub Action usage according to ASF project allowance
> -
>
> Key: SPARK-48094
> URL: https://issues.apache.org/jira/browse/SPARK-48094
> Project: Spark
>  Issue Type: Umbrella
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>Priority: Blocker
> Attachments: Screenshot 2024-05-02 at 23.56.05.png
>
>
> h2. ASF INFRA POLICY
> - https://infra.apache.org/github-actions-policy.html
> h2. MONITORING
> - https://infra-reports.apache.org/#ghactions=spark=168
>  !Screenshot 2024-05-02 at 23.56.05.png|width=100%! 
> h2. TARGET
> * All workflows MUST have a job concurrency level less than or equal to 20. 
> This means a workflow cannot have more than 20 jobs running at the same time 
> across all matrices.
> * All workflows SHOULD have a job concurrency level less than or equal to 15. 
> Just because 20 is the max, doesn't mean you should strive for 20.
> * The average number of minutes a project uses per calendar week MUST NOT 
> exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 
> hours).
> * The average number of minutes a project uses in any consecutive five-day 
> period MUST NOT exceed the equivalent of 30 full-time runners (216,000 
> minutes, or 3,600 hours).
> h2. DEADLINE
> bq. 17th of May, 2024
> Since the deadline is 17th of May, 2024, I set this as the highest priority, 
> `Blocker`.
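The two concurrency targets quoted above map onto standard GitHub Actions settings: a workflow-level `concurrency` group (so superseded runs are cancelled instead of queuing) and `strategy.max-parallel` on matrix jobs. A minimal sketch, assuming hypothetical job and module names that are not taken from Spark's actual workflows:

```yaml
name: build
on: push

# Cancel superseded runs of the same ref so queued jobs do not pile up.
concurrency:
  group: build-${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  test-matrix:
    runs-on: ubuntu-latest
    strategy:
      # Keep under the MUST limit of 20 concurrent jobs (SHOULD be <= 15).
      max-parallel: 15
      matrix:
        module: [core, sql, streaming]
        java: [17, 21]
    steps:
      - uses: actions/checkout@v4
      - run: ./build/mvn -pl ${{ matrix.module }} test
```

With `max-parallel: 15`, the six matrix cells above would all run at once anyway; the cap only bites for larger matrices, but it guarantees the workflow stays within the policy regardless of how the matrix grows.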






[jira] [Updated] (SPARK-48187) Run `docs` only in PR builders and `build_non_ansi` Daily CI

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48187?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48187:
--
Fix Version/s: 4.0.0

> Run `docs` only in PR builders and `build_non_ansi` Daily CI
> 
>
> Key: SPARK-48187
> URL: https://issues.apache.org/jira/browse/SPARK-48187
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>    Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Updated] (SPARK-48187) Run `docs` only in PR builders and `build_non_ansi` Daily CI

2024-05-09 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48187?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48187:
--
Fix Version/s: (was: 4.0.0)

> Run `docs` only in PR builders and `build_non_ansi` Daily CI
> 
>
> Key: SPARK-48187
> URL: https://issues.apache.org/jira/browse/SPARK-48187
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>    Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Reopened] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reopened SPARK-48094:
---
  Assignee: (was: Dongjoon Hyun)

> Reduce GitHub Action usage according to ASF project allowance
> -
>
> Key: SPARK-48094
> URL: https://issues.apache.org/jira/browse/SPARK-48094
> Project: Spark
>  Issue Type: Umbrella
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>Priority: Blocker
> Fix For: 4.0.0
>
> Attachments: Screenshot 2024-05-02 at 23.56.05.png
>
>
> h2. ASF INFRA POLICY
> - https://infra.apache.org/github-actions-policy.html
> h2. MONITORING
> - https://infra-reports.apache.org/#ghactions=spark=168
>  !Screenshot 2024-05-02 at 23.56.05.png|width=100%! 
> h2. TARGET
> * All workflows MUST have a job concurrency level less than or equal to 20. 
> This means a workflow cannot have more than 20 jobs running at the same time 
> across all matrices.
> * All workflows SHOULD have a job concurrency level less than or equal to 15. 
> Just because 20 is the max, doesn't mean you should strive for 20.
> * The average number of minutes a project uses per calendar week MUST NOT 
> exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 
> hours).
> * The average number of minutes a project uses in any consecutive five-day 
> period MUST NOT exceed the equivalent of 30 full-time runners (216,000 
> minutes, or 3,600 hours).
> h2. DEADLINE
> bq. 17th of May, 2024
> Since the deadline is 17th of May, 2024, I set this as the highest priority, 
> `Blocker`.
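The per-runner allowance quoted above is simple arithmetic (runners × days × 24 h × 60 min); a quick check, noting that the policy's 250,000-minute figure is a rounding of the exact 252,000:

```python
def allowance_minutes(runners: int, days: int) -> int:
    """Minutes of CI time `runners` full-time runners provide over `days` days, 24 h/day."""
    return runners * days * 24 * 60

weekly = allowance_minutes(25, 7)    # 25 runners over a calendar week
five_day = allowance_minutes(30, 5)  # 30 runners over any five-day window

print(weekly, weekly // 60)      # 252000 minutes, 4200 hours (policy rounds to 250,000)
print(five_day, five_day // 60)  # 216000 minutes, 3600 hours (matches the policy exactly)
```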






[jira] [Assigned] (SPARK-48204) fix release script for Spark 4.0+

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48204?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48204:
-

Assignee: Wenchen Fan

> fix release script for Spark 4.0+
> -
>
> Key: SPARK-48204
> URL: https://issues.apache.org/jira/browse/SPARK-48204
> Project: Spark
>  Issue Type: Bug
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Wenchen Fan
>Assignee: Wenchen Fan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-48204) fix release script for Spark 4.0+

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48204?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48204.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46484
[https://github.com/apache/spark/pull/46484]

> fix release script for Spark 4.0+
> -
>
> Key: SPARK-48204
> URL: https://issues.apache.org/jira/browse/SPARK-48204
> Project: Spark
>  Issue Type: Bug
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Wenchen Fan
>Assignee: Wenchen Fan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-48094) Reduce GitHub Action usage according to ASF project allowance

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48094.
---
  Assignee: Dongjoon Hyun
Resolution: Fixed

> Reduce GitHub Action usage according to ASF project allowance
> -
>
> Key: SPARK-48094
> URL: https://issues.apache.org/jira/browse/SPARK-48094
> Project: Spark
>  Issue Type: Umbrella
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>    Assignee: Dongjoon Hyun
>Priority: Blocker
> Fix For: 4.0.0
>
> Attachments: Screenshot 2024-05-02 at 23.56.05.png
>
>
> h2. ASF INFRA POLICY
> - https://infra.apache.org/github-actions-policy.html
> h2. MONITORING
> - https://infra-reports.apache.org/#ghactions=spark=168
>  !Screenshot 2024-05-02 at 23.56.05.png|width=100%! 
> h2. TARGET
> * All workflows MUST have a job concurrency level less than or equal to 20. 
> This means a workflow cannot have more than 20 jobs running at the same time 
> across all matrices.
> * All workflows SHOULD have a job concurrency level less than or equal to 15. 
> Just because 20 is the max, doesn't mean you should strive for 20.
> * The average number of minutes a project uses per calendar week MUST NOT 
> exceed the equivalent of 25 full-time runners (250,000 minutes, or 4,200 
> hours).
> * The average number of minutes a project uses in any consecutive five-day 
> period MUST NOT exceed the equivalent of 30 full-time runners (216,000 
> minutes, or 3,600 hours).
> h2. DEADLINE
> bq. 17th of May, 2024
> Since the deadline is 17th of May, 2024, I set this as the highest priority, 
> `Blocker`.






[jira] [Resolved] (SPARK-48207) Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if needed

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48207?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48207.
---
Fix Version/s: 3.4.4
   Resolution: Fixed

Issue resolved by pull request 46489
[https://github.com/apache/spark/pull/46489]

> Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if needed
> 
>
> Key: SPARK-48207
> URL: https://issues.apache.org/jira/browse/SPARK-48207
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 3.4.4
>    Reporter: Dongjoon Hyun
>    Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.4.4
>
>







[jira] [Assigned] (SPARK-48207) Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if needed

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48207?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reassigned SPARK-48207:
-

Assignee: Dongjoon Hyun

> Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if needed
> 
>
> Key: SPARK-48207
> URL: https://issues.apache.org/jira/browse/SPARK-48207
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 3.4.4
>    Reporter: Dongjoon Hyun
>    Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Updated] (SPARK-48207) Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if needed

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48207?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48207:
--
Summary: Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if 
needed  (was: Run build/scala-213/java-11-17 jobs of `branch-3.4` only if 
needed)

> Run `build/scala-213/java-11-17` jobs of `branch-3.4` only if needed
> 
>
> Key: SPARK-48207
> URL: https://issues.apache.org/jira/browse/SPARK-48207
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 3.4.4
>    Reporter: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-48207) Run build/scala-213/java-11-17 jobs of `branch-3.4` only if needed

2024-05-08 Thread Dongjoon Hyun (Jira)
Dongjoon Hyun created SPARK-48207:
-

 Summary: Run build/scala-213/java-11-17 jobs of `branch-3.4` only 
if needed
 Key: SPARK-48207
 URL: https://issues.apache.org/jira/browse/SPARK-48207
 Project: Spark
  Issue Type: Sub-task
  Components: Project Infra
Affects Versions: 3.4.4
Reporter: Dongjoon Hyun









[jira] [Updated] (SPARK-48192) Enable TPC-DS and docker tests in forked repository

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48192?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48192:
--
Fix Version/s: 3.4.4

> Enable TPC-DS and docker tests in forked repository
> ---
>
> Key: SPARK-48192
> URL: https://issues.apache.org/jira/browse/SPARK-48192
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra, SQL
>Affects Versions: 4.0.0
>Reporter: Hyukjin Kwon
>Assignee: Hyukjin Kwon
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2, 3.4.4
>
>
> TPC-DS is pretty important in SQL. Should at least enable it in forked 
> repositories (PR builders), which do not consume ASF resources.






[jira] [Updated] (SPARK-48132) Run `k8s-integration-tests` only in PR builder and Daily CIs

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48132?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48132:
--
Fix Version/s: 3.4.4

> Run `k8s-integration-tests` only in PR builder and Daily CIs
> 
>
> Key: SPARK-48132
> URL: https://issues.apache.org/jira/browse/SPARK-48132
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>    Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2, 3.4.4
>
>







[jira] [Updated] (SPARK-48192) Enable TPC-DS and docker tests in forked repository

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48192?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48192:
--
Summary: Enable TPC-DS and docker tests in forked repository  (was: Enable 
TPC-DS tests in forked repository)

> Enable TPC-DS and docker tests in forked repository
> ---
>
> Key: SPARK-48192
> URL: https://issues.apache.org/jira/browse/SPARK-48192
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra, SQL
>Affects Versions: 4.0.0
>Reporter: Hyukjin Kwon
>Assignee: Hyukjin Kwon
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2
>
>
> TPC-DS is pretty important in SQL. Should at least enable it in forked 
> repositories (PR builders), which do not consume ASF resources.






[jira] [Updated] (SPARK-48192) Enable TPC-DS tests in forked repository

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48192?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48192:
--
Fix Version/s: 3.5.2

> Enable TPC-DS tests in forked repository
> 
>
> Key: SPARK-48192
> URL: https://issues.apache.org/jira/browse/SPARK-48192
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra, SQL
>Affects Versions: 4.0.0
>Reporter: Hyukjin Kwon
>Assignee: Hyukjin Kwon
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2
>
>
> TPC-DS is pretty important in SQL. Should at least enable it in forked 
> repositories (PR builders), which do not consume ASF resources.






[jira] [Updated] (SPARK-48133) Run `sparkr` only in PR builders and Daily CIs

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48133?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48133:
--
Fix Version/s: 3.5.2

> Run `sparkr` only in PR builders and Daily CIs
> --
>
> Key: SPARK-48133
> URL: https://issues.apache.org/jira/browse/SPARK-48133
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>    Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2
>
>







[jira] [Updated] (SPARK-48109) Enable `k8s-integration-tests` only for `kubernetes` module change

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48109?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48109:
--
Fix Version/s: 3.5.2

> Enable `k8s-integration-tests` only for `kubernetes` module change
> --
>
> Key: SPARK-48109
> URL: https://issues.apache.org/jira/browse/SPARK-48109
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>    Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2
>
>
> Although there is a chance of missing the related core module change, daily 
> CI test coverage will reveal that.






Re: Regarding ORC release 2.0.1

2024-05-08 Thread Dongjoon Hyun
Thank you, William. It sounds good to me.

I only cherry-picked ORC-1714 from main to branch-2.0 for Apache ORC 2.0.1 
release.

Dongjoon.

On 2024/05/08 06:59:38 "William H." wrote:
> Hello All,
> 
> I am preparing to release ORC version 2.0.1 this Thursday, May 9th.
> Please let me know if there is anything that you would like to see
> included in this release.
> 
> Best Regards,
> William
> 


[jira] [Updated] (SPARK-48116) Run `pyspark-pandas*` only in PR builder and Daily Python CIs

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48116?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48116:
--
Fix Version/s: 3.4.4

> Run `pyspark-pandas*` only in PR builder and Daily Python CIs
> -
>
> Key: SPARK-48116
> URL: https://issues.apache.org/jira/browse/SPARK-48116
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>    Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2, 3.4.4
>
>







[jira] [Updated] (SPARK-48116) Run `pyspark-pandas*` only in PR builder and Daily Python CIs

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48116?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48116:
--
Fix Version/s: 3.5.2

> Run `pyspark-pandas*` only in PR builder and Daily Python CIs
> -
>
> Key: SPARK-48116
> URL: https://issues.apache.org/jira/browse/SPARK-48116
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>    Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2
>
>







[jira] [Resolved] (SPARK-48203) Spin off `pyspark` tests from `build_branch34.yml` Daily CI

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48203?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48203.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46480
[https://github.com/apache/spark/pull/46480]

> Spin off `pyspark` tests from `build_branch34.yml` Daily CI
> ---
>
> Key: SPARK-48203
> URL: https://issues.apache.org/jira/browse/SPARK-48203
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>    Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-48202) Spin off `pyspark` tests from `build_branch35.yml` Daily CI

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48202?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48202.
---
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46479
[https://github.com/apache/spark/pull/46479]

> Spin off `pyspark` tests from `build_branch35.yml` Daily CI
> ---
>
> Key: SPARK-48202
> URL: https://issues.apache.org/jira/browse/SPARK-48202
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>    Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (ORC-1715) Bump org.objenesis:objenesis to 3.3

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/ORC-1715?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved ORC-1715.

Fix Version/s: 2.0.1
   2.1.0
   Resolution: Fixed

Issue resolved by pull request 1927
[https://github.com/apache/orc/pull/1927]

> Bump org.objenesis:objenesis to 3.3
> ---
>
> Key: ORC-1715
> URL: https://issues.apache.org/jira/browse/ORC-1715
> Project: ORC
>  Issue Type: Bug
>  Components: Java
>Affects Versions: 2.0.1
>Reporter: William Hyun
>Assignee: William Hyun
>Priority: Minor
> Fix For: 2.0.1, 2.1.0
>
>






[jira] [Closed] (SPARK-48149) Serialize `build_python.yml` to run a single Python version per cron schedule

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48149?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun closed SPARK-48149.
-

> Serialize `build_python.yml` to run a single Python version per cron schedule
> -
>
> Key: SPARK-48149
> URL: https://issues.apache.org/jira/browse/SPARK-48149
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-48149) Serialize `build_python.yml` to run a single Python version per cron schedule

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48149?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-48149.
---
Fix Version/s: (was: 4.0.0)
   Resolution: Abandoned

> Serialize `build_python.yml` to run a single Python version per cron schedule
> -
>
> Key: SPARK-48149
> URL: https://issues.apache.org/jira/browse/SPARK-48149
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Reopened] (SPARK-48149) Serialize `build_python.yml` to run a single Python version per cron schedule

2024-05-08 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48149?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun reopened SPARK-48149:
---

> Serialize `build_python.yml` to run a single Python version per cron schedule
> -
>
> Key: SPARK-48149
> URL: https://issues.apache.org/jira/browse/SPARK-48149
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>    Reporter: Dongjoon Hyun
>    Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>






