[jira] [Resolved] (SPARK-48238) Spark fail to start due to class o.a.h.yarn.server.webproxy.amfilter.AmIpFilter is not a jakarta.servlet.Filter

2024-05-20 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48238?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-48238.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46611
[https://github.com/apache/spark/pull/46611]

> Spark fail to start due to class 
> o.a.h.yarn.server.webproxy.amfilter.AmIpFilter is not a jakarta.servlet.Filter
> ---
>
> Key: SPARK-48238
> URL: https://issues.apache.org/jira/browse/SPARK-48238
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Blocker
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> I tested the latest master branch; it failed to start in YARN mode
> {code:java}
> dev/make-distribution.sh --tgz -Phive,hive-thriftserver,yarn{code}
>  
> {code:java}
> $ bin/spark-sql --master yarn
> WARNING: Using incubator modules: jdk.incubator.vector
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
> setLogLevel(newLevel).
> 2024-05-10 17:58:17 WARN NativeCodeLoader: Unable to load native-hadoop 
> library for your platform... using builtin-java classes where applicable
> 2024-05-10 17:58:18 WARN Client: Neither spark.yarn.jars nor 
> spark.yarn.archive is set, falling back to uploading libraries under 
> SPARK_HOME.
> 2024-05-10 17:58:25 ERROR SparkContext: Error initializing SparkContext.
> org.sparkproject.jetty.util.MultiException: Multiple exceptions
>     at 
> org.sparkproject.jetty.util.MultiException.ifExceptionThrow(MultiException.java:117)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.servlet.ServletHandler.initialize(ServletHandler.java:751)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:392)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:902)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:306)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:93)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at org.apache.spark.ui.ServerInfo.addHandler(JettyUtils.scala:514) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2(SparkUI.scala:81) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2$adapted(SparkUI.scala:81)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619) 
> ~[scala-library-2.13.13.jar:?]
>     at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617) 
> ~[scala-library-2.13.13.jar:?]
>     at scala.collection.AbstractIterable.foreach(Iterable.scala:935) 
> ~[scala-library-2.13.13.jar:?]
>     at 
> org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1(SparkUI.scala:81) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1$adapted(SparkUI.scala:79)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?]
>     at org.apache.spark.ui.SparkUI.attachAllHandlers(SparkUI.scala:79) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at org.apache.spark.SparkContext.$anonfun$new$31(SparkContext.scala:690) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.SparkContext.$anonfun$new$31$adapted(SparkContext.scala:690) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?]
>     at org.apache.spark.SparkContext.(SparkContext.scala:690) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2963) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:1118)
>  ~[spark-sql_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at scala.Option.getOrElse(Option.scala:201) [scala-library-2.13.13.jar:?]
>     at 
> org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:1112)
>  [spark-sql_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> 
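The failure above points to an API mismatch: Spark 4.0's bundled Jetty speaks the jakarta.servlet API, while Hadoop's o.a.h.yarn.server.webproxy.amfilter.AmIpFilter still implements javax.servlet.Filter, so Jetty rejects it when the YARN AM filter is installed on the UI handlers. For illustration only (this is not necessarily how the linked PR 46611 solves it, and the class and parameter names below are hypothetical), a jakarta-compatible filter with AmIpFilter-like behavior would look roughly like this:

{code:scala}
import jakarta.servlet.{Filter, FilterChain, ServletRequest, ServletResponse}
import jakarta.servlet.http.{HttpServletRequest, HttpServletResponse}

// Hypothetical sketch: only requests arriving from a known YARN proxy address are
// passed through; everything else is redirected back to the proxy URI.
class JakartaAmIpStyleFilter(allowedProxies: Set[String], proxyUri: String) extends Filter {
  override def doFilter(req: ServletRequest, res: ServletResponse, chain: FilterChain): Unit =
    (req, res) match {
      case (httpReq: HttpServletRequest, httpRes: HttpServletResponse) =>
        if (allowedProxies.contains(httpReq.getRemoteAddr)) {
          chain.doFilter(req, res)        // request came through a trusted proxy
        } else {
          httpRes.sendRedirect(proxyUri)  // send direct clients back through the proxy
        }
      case _ =>
        chain.doFilter(req, res)          // non-HTTP traffic: just continue the chain
    }
}
{code}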

[jira] [Assigned] (SPARK-48238) Spark fail to start due to class o.a.h.yarn.server.webproxy.amfilter.AmIpFilter is not a jakarta.servlet.Filter

2024-05-20 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48238?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-48238:


Assignee: Cheng Pan

> Spark fail to start due to class 
> o.a.h.yarn.server.webproxy.amfilter.AmIpFilter is not a jakarta.servlet.Filter
> ---
>
> Key: SPARK-48238
> URL: https://issues.apache.org/jira/browse/SPARK-48238
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Blocker
>  Labels: pull-request-available
>
> I tested the latest master branch; it failed to start in YARN mode
> {code:java}
> dev/make-distribution.sh --tgz -Phive,hive-thriftserver,yarn{code}
>  
> {code:java}
> $ bin/spark-sql --master yarn
> WARNING: Using incubator modules: jdk.incubator.vector
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
> setLogLevel(newLevel).
> 2024-05-10 17:58:17 WARN NativeCodeLoader: Unable to load native-hadoop 
> library for your platform... using builtin-java classes where applicable
> 2024-05-10 17:58:18 WARN Client: Neither spark.yarn.jars nor 
> spark.yarn.archive is set, falling back to uploading libraries under 
> SPARK_HOME.
> 2024-05-10 17:58:25 ERROR SparkContext: Error initializing SparkContext.
> org.sparkproject.jetty.util.MultiException: Multiple exceptions
>     at 
> org.sparkproject.jetty.util.MultiException.ifExceptionThrow(MultiException.java:117)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.servlet.ServletHandler.initialize(ServletHandler.java:751)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:392)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:902)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:306)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:93)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at org.apache.spark.ui.ServerInfo.addHandler(JettyUtils.scala:514) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2(SparkUI.scala:81) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2$adapted(SparkUI.scala:81)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619) 
> ~[scala-library-2.13.13.jar:?]
>     at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617) 
> ~[scala-library-2.13.13.jar:?]
>     at scala.collection.AbstractIterable.foreach(Iterable.scala:935) 
> ~[scala-library-2.13.13.jar:?]
>     at 
> org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1(SparkUI.scala:81) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1$adapted(SparkUI.scala:79)
>  ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?]
>     at org.apache.spark.ui.SparkUI.attachAllHandlers(SparkUI.scala:79) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at org.apache.spark.SparkContext.$anonfun$new$31(SparkContext.scala:690) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.SparkContext.$anonfun$new$31$adapted(SparkContext.scala:690) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?]
>     at org.apache.spark.SparkContext.(SparkContext.scala:690) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2963) 
> ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:1118)
>  ~[spark-sql_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at scala.Option.getOrElse(Option.scala:201) [scala-library-2.13.13.jar:?]
>     at 
> org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:1112)
>  [spark-sql_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>     at 
> org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:64)
>  

[jira] [Resolved] (SPARK-48242) Upgrade extra-enforcer-rules to 1.8.0

2024-05-20 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48242?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-48242.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46538
[https://github.com/apache/spark/pull/46538]

> Upgrade extra-enforcer-rules to 1.8.0
> -
>
> Key: SPARK-48242
> URL: https://issues.apache.org/jira/browse/SPARK-48242
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48242) Upgrade extra-enforcer-rules to 1.8.0

2024-05-20 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48242?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-48242:


Assignee: BingKun Pan

> Upgrade extra-enforcer-rules to 1.8.0
> -
>
> Key: SPARK-48242
> URL: https://issues.apache.org/jira/browse/SPARK-48242
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-48299) Upgrade scala-maven-plugin to 4.9.1

2024-05-15 Thread Yang Jie (Jira)
Yang Jie created SPARK-48299:


 Summary: Upgrade scala-maven-plugin to 4.9.1
 Key: SPARK-48299
 URL: https://issues.apache.org/jira/browse/SPARK-48299
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Resolved] (SPARK-48274) Upgrade GenJavadoc to 0.19

2024-05-14 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48274?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-48274.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46579
[https://github.com/apache/spark/pull/46579]

>  Upgrade GenJavadoc to 0.19
> ---
>
> Key: SPARK-48274
> URL: https://issues.apache.org/jira/browse/SPARK-48274
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-48257) Polish POM for Hive dependencies

2024-05-13 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48257?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-48257.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46558
[https://github.com/apache/spark/pull/46558]

> Polish POM for Hive dependencies
> 
>
> Key: SPARK-48257
> URL: https://issues.apache.org/jira/browse/SPARK-48257
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48257) Polish POM for Hive dependencies

2024-05-13 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48257?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-48257:


Assignee: Cheng Pan

> Polish POM for Hive dependencies
> 
>
> Key: SPARK-48257
> URL: https://issues.apache.org/jira/browse/SPARK-48257
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-48171) Clean up the use of deprecated APIs related to `o.rocksdb.Logger`

2024-05-07 Thread Yang Jie (Jira)
Yang Jie created SPARK-48171:


 Summary: Clean up the use of deprecated APIs related to 
`o.rocksdb.Logger`
 Key: SPARK-48171
 URL: https://issues.apache.org/jira/browse/SPARK-48171
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Yang Jie


{code:java}
/**
 * AbstractLogger constructor.
 *
 * Important: the log level set within
 * the {@link org.rocksdb.Options} instance will be used as
 * maximum log level of RocksDB.
 *
 * @param options {@link org.rocksdb.Options} instance.
 *
 * @deprecated Use {@link Logger#Logger(InfoLogLevel)} instead, e.g. {@code new
 * Logger(options.infoLogLevel())}.
 */
@Deprecated
public Logger(final Options options) {
  this(options.infoLogLevel());
} {code}
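The Javadoc quoted above already names the replacement: construct the `Logger` from an `InfoLogLevel` instead of the whole `Options` object. A minimal sketch of the migration (illustrative only; Spark's actual logger wiring may differ):

{code:scala}
import org.rocksdb.{InfoLogLevel, Logger, Options}

val options = new Options().setInfoLogLevel(InfoLogLevel.INFO_LEVEL)

// Deprecated: new Logger(options)
// Replacement per the deprecation note: pass the log level explicitly.
val dbLogger = new Logger(options.infoLogLevel()) {
  override protected def log(level: InfoLogLevel, message: String): Unit =
    println(s"[$level] $message")  // placeholder sink for the sketch
}
options.setLogger(dbLogger)
{code}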






[jira] [Resolved] (SPARK-48138) Disable a flaky `SparkSessionE2ESuite.interrupt tag` test

2024-05-05 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48138?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-48138.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46396
[https://github.com/apache/spark/pull/46396]

> Disable a flaky `SparkSessionE2ESuite.interrupt tag` test
> -
>
> Key: SPARK-48138
> URL: https://issues.apache.org/jira/browse/SPARK-48138
> Project: Spark
>  Issue Type: Sub-task
>  Components: Connect, Tests
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> - https://github.com/apache/spark/actions/runs/8962353911/job/24611130573 
> (Master, 5/5)
> - https://github.com/apache/spark/actions/runs/8948176536/job/24581022674 
> (Master, 5/4)






[jira] [Assigned] (SPARK-48138) Disable a flaky `SparkSessionE2ESuite.interrupt tag` test

2024-05-05 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48138?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-48138:


Assignee: Dongjoon Hyun

> Disable a flaky `SparkSessionE2ESuite.interrupt tag` test
> -
>
> Key: SPARK-48138
> URL: https://issues.apache.org/jira/browse/SPARK-48138
> Project: Spark
>  Issue Type: Sub-task
>  Components: Connect, Tests
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
>
> - https://github.com/apache/spark/actions/runs/8962353911/job/24611130573 
> (Master, 5/5)
> - https://github.com/apache/spark/actions/runs/8948176536/job/24581022674 
> (Master, 5/4)






[jira] [Created] (SPARK-48001) Remove unused `private implicit def arrayToArrayWritable` from `SparkContext`

2024-04-26 Thread Yang Jie (Jira)
Yang Jie created SPARK-48001:


 Summary: Remove unused `private implicit def arrayToArrayWritable` 
from `SparkContext`
 Key: SPARK-48001
 URL: https://issues.apache.org/jira/browse/SPARK-48001
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Created] (SPARK-47984) Change `MetricsAggregate/V2Aggregator`'s `serialize/deserialize` to call `SparkSerDeUtils`'s `serialize/deserialize` methods.

2024-04-25 Thread Yang Jie (Jira)
Yang Jie created SPARK-47984:


 Summary: Change `MetricsAggregate/V2Aggregator`'s 
`serialize/deserialize` to call `SparkSerDeUtils`'s `serialize/deserialize` 
methods.
 Key: SPARK-47984
 URL: https://issues.apache.org/jira/browse/SPARK-47984
 Project: Spark
  Issue Type: Improvement
  Components: MLlib, SQL
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Resolved] (SPARK-47928) Speed up test "Add jar support Ivy URI in SQL"

2024-04-22 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47928?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-47928.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46150
[https://github.com/apache/spark/pull/46150]

> Speed up test "Add jar support Ivy URI in SQL"
> --
>
> Key: SPARK-47928
> URL: https://issues.apache.org/jira/browse/SPARK-47928
> Project: Spark
>  Issue Type: Test
>  Components: SQL
>Affects Versions: 3.2.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-47928) Speed up test "Add jar support Ivy URI in SQL"

2024-04-22 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47928?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-47928:


Assignee: Cheng Pan

> Speed up test "Add jar support Ivy URI in SQL"
> --
>
> Key: SPARK-47928
> URL: https://issues.apache.org/jira/browse/SPARK-47928
> Project: Spark
>  Issue Type: Test
>  Components: SQL
>Affects Versions: 3.2.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-47932) Avoid using legacy commons-lang

2024-04-22 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47932?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-47932.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46154
[https://github.com/apache/spark/pull/46154]

> Avoid using legacy commons-lang
> ---
>
> Key: SPARK-47932
> URL: https://issues.apache.org/jira/browse/SPARK-47932
> Project: Spark
>  Issue Type: Test
>  Components: SQL, Tests
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-47901) Upgrade commons-text to 1.12.0

2024-04-18 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47901?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-47901:


Assignee: Yang Jie

> Upgrade commons-text to 1.12.0
> --
>
> Key: SPARK-47901
> URL: https://issues.apache.org/jira/browse/SPARK-47901
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>
> https://github.com/apache/commons-text/blob/rel/commons-text-1.12.0/RELEASE-NOTES.txt






[jira] [Resolved] (SPARK-47901) Upgrade commons-text to 1.12.0

2024-04-18 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47901?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-47901.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46127
[https://github.com/apache/spark/pull/46127]

> Upgrade commons-text to 1.12.0
> --
>
> Key: SPARK-47901
> URL: https://issues.apache.org/jira/browse/SPARK-47901
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> https://github.com/apache/commons-text/blob/rel/commons-text-1.12.0/RELEASE-NOTES.txt






[jira] [Created] (SPARK-47901) UPgrade commons-text to 1.12.0

2024-04-18 Thread Yang Jie (Jira)
Yang Jie created SPARK-47901:


 Summary: UPgrade commons-text to 1.12.0
 Key: SPARK-47901
 URL: https://issues.apache.org/jira/browse/SPARK-47901
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie


https://github.com/apache/commons-text/blob/rel/commons-text-1.12.0/RELEASE-NOTES.txt






[jira] [Updated] (SPARK-47901) Upgrade commons-text to 1.12.0

2024-04-18 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47901?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-47901:
-
Summary: Upgrade commons-text to 1.12.0  (was: UPgrade commons-text to 
1.12.0)

> Upgrade commons-text to 1.12.0
> --
>
> Key: SPARK-47901
> URL: https://issues.apache.org/jira/browse/SPARK-47901
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>
> https://github.com/apache/commons-text/blob/rel/commons-text-1.12.0/RELEASE-NOTES.txt






[jira] [Assigned] (SPARK-47850) Support converting insert for unpartitioned Hive table

2024-04-18 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47850?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-47850:


Assignee: Cheng Pan

> Support converting insert for unpartitioned Hive table
> --
>
> Key: SPARK-47850
> URL: https://issues.apache.org/jira/browse/SPARK-47850
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-47850) Support converting insert for unpartitioned Hive table

2024-04-18 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47850?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-47850.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46052
[https://github.com/apache/spark/pull/46052]

> Support converting insert for unpartitioned Hive table
> --
>
> Key: SPARK-47850
> URL: https://issues.apache.org/jira/browse/SPARK-47850
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Created] (SPARK-47887) Remove unused import `spark/connect/common.proto` from `spark/connect/relations.proto`

2024-04-17 Thread Yang Jie (Jira)
Yang Jie created SPARK-47887:


 Summary: Remove unused import `spark/connect/common.proto` from 
`spark/connect/relations.proto`
 Key: SPARK-47887
 URL: https://issues.apache.org/jira/browse/SPARK-47887
 Project: Spark
  Issue Type: Improvement
  Components: Connect
Affects Versions: 4.0.0
Reporter: Yang Jie


Fix the following compile warning:

 
{code:java}
spark/connect/relations.proto:26:1: warning: Import spark/connect/common.proto 
is unused. {code}






[jira] [Created] (SPARK-47834) Mark deprecated functions with `@deprecated` in `SQLImplicits`

2024-04-12 Thread Yang Jie (Jira)
Yang Jie created SPARK-47834:


 Summary: Mark deprecated functions with `@deprecated` in 
`SQLImplicits`
 Key: SPARK-47834
 URL: https://issues.apache.org/jira/browse/SPARK-47834
 Project: Spark
  Issue Type: Improvement
  Components: Connect, Spark Core
Affects Versions: 4.0.0
Reporter: Yang Jie
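There is no description here, but the change the title asks for is mechanical: annotate the members the title calls deprecated with Scala's `@deprecated` so callers get a compiler warning. A tiny sketch of the annotation shape only (the method below is a made-up placeholder, not an actual `SQLImplicits` member):

{code:scala}
// Placeholder method name; only the annotation shape is the point.
@deprecated("Use the replacement conversion instead.", since = "4.0.0")
def legacyConversion(value: String): String = value
{code}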









[jira] [Resolved] (SPARK-47770) Fix `GenerateMIMAIgnore.isPackagePrivateModule` to return false instead of failing

2024-04-08 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47770?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-47770.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45938
[https://github.com/apache/spark/pull/45938]

> Fix `GenerateMIMAIgnore.isPackagePrivateModule` to return false instead of 
> failing
> --
>
> Key: SPARK-47770
> URL: https://issues.apache.org/jira/browse/SPARK-47770
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Created] (SPARK-47709) Upgrade tink to 1.13.0

2024-04-03 Thread Yang Jie (Jira)
Yang Jie created SPARK-47709:


 Summary: Upgrade tink to 1.13.0
 Key: SPARK-47709
 URL: https://issues.apache.org/jira/browse/SPARK-47709
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie


[https://github.com/tink-crypto/tink-java/releases/tag/v1.13.0]

 
 * AES-GCM is now about 20% faster.






[jira] [Assigned] (SPARK-47685) Should restore the handling of `Stream` in `RelationalGroupedDataset#toDF`

2024-04-02 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47685?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-47685:


Assignee: Yang Jie

> Should restore the handling of `Stream` in `RelationalGroupedDataset#toDF`
> --
>
> Key: SPARK-47685
> URL: https://issues.apache.org/jira/browse/SPARK-47685
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-47685) Should restore the handling of `Stream` in `RelationalGroupedDataset#toDF`

2024-04-02 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47685?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-47685.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45811
[https://github.com/apache/spark/pull/45811]

> Should restore the handling of `Stream` in `RelationalGroupedDataset#toDF`
> --
>
> Key: SPARK-47685
> URL: https://issues.apache.org/jira/browse/SPARK-47685
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Created] (SPARK-47686) Use `=!=` instead of `!==` in `JoinHintSuite`

2024-04-01 Thread Yang Jie (Jira)
Yang Jie created SPARK-47686:


 Summary: Use `=!=` instead of `!==` in `JoinHintSuite`
 Key: SPARK-47686
 URL: https://issues.apache.org/jira/browse/SPARK-47686
 Project: Spark
  Issue Type: Improvement
  Components: SQL, Tests
Affects Versions: 4.0.0
Reporter: Yang Jie
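For context (not stated in the ticket body): on `Column`, `!==` is a deprecated alias of `=!=` (the deprecation note points out it does not have the same precedence as `===`), so the suite should spell inequality with the non-deprecated operator. A one-line sketch with made-up column names:

{code:scala}
import org.apache.spark.sql.functions.col

// Preferred: `=!=`. The deprecated spelling would be `col("a") !== col("b")`.
val predicate = col("a") =!= col("b")
{code}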









[jira] [Updated] (SPARK-47685) Should restore the handling of `Stream` in `RelationalGroupedDataset#toDF`

2024-04-01 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47685?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-47685:
-
Summary: Should restore the handling of `Stream` in 
`RelationalGroupedDataset#toDF`  (was: Restore the handling of `Stream` in 
`RelationalGroupedDataset#toDF`)

> Should restore the handling of `Stream` in `RelationalGroupedDataset#toDF`
> --
>
> Key: SPARK-47685
> URL: https://issues.apache.org/jira/browse/SPARK-47685
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>







[jira] [Created] (SPARK-47685) Restore the handling of `Stream` in `RelationalGroupedDataset#toDF`

2024-04-01 Thread Yang Jie (Jira)
Yang Jie created SPARK-47685:


 Summary: Restore the handling of `Stream` in 
`RelationalGroupedDataset#toDF`
 Key: SPARK-47685
 URL: https://issues.apache.org/jira/browse/SPARK-47685
 Project: Spark
  Issue Type: Bug
  Components: SQL
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Updated] (SPARK-45593) Building a runnable distribution from master code running spark-sql raise error "java.lang.ClassNotFoundException: org.sparkproject.guava.util.concurrent.internal.Intern

2024-04-01 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-45593?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-45593:
-
Affects Version/s: 3.5.1

> Building a runnable distribution from master code running spark-sql raise 
> error "java.lang.ClassNotFoundException: 
> org.sparkproject.guava.util.concurrent.internal.InternalFutureFailureAccess"
> ---
>
> Key: SPARK-45593
> URL: https://issues.apache.org/jira/browse/SPARK-45593
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0, 3.5.1
>Reporter: yikaifei
>Assignee: yikaifei
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2
>
>
> Building a runnable distribution from the master branch and running spark-sql raises the error
> "java.lang.ClassNotFoundException:
> org.sparkproject.guava.util.concurrent.internal.InternalFutureFailureAccess".
> Reproduction steps, starting from a fresh clone of the Spark master branch:
>  # Build a runnable distribution:
> `/dev/make-distribution.sh --name ui --pip --tgz -Phive -Phive-thriftserver
> -Pyarn -Pconnect`
>  # Install the runnable distribution package
>  # Run `bin/spark-sql`
> The following error is raised:
> {code:java}
>  23/10/18 20:51:46 WARN NativeCodeLoader: Unable to load native-hadoop 
> library for your platform... using builtin-java classes where applicable
> Exception in thread "main" java.lang.NoClassDefFoundError: 
> org/sparkproject/guava/util/concurrent/internal/InternalFutureFailureAccess
>     at java.base/java.lang.ClassLoader.defineClass1(Native Method)
>     at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1012)
>     at 
> java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:150)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:862)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(BuiltinClassLoader.java:760)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:681)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:639)
>     at 
> java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
>     at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520)
>     at java.base/java.lang.ClassLoader.defineClass1(Native Method)
>     at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1012)
>     at 
> java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:150)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:862)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(BuiltinClassLoader.java:760)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:681)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:639)
>     at 
> java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
>     at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520)
>     at java.base/java.lang.ClassLoader.defineClass1(Native Method)
>     at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1012)
>     at 
> java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:150)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:862)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(BuiltinClassLoader.java:760)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:681)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:639)
>     at 
> java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
>     at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520)
>     at 
> org.sparkproject.guava.cache.LocalCache$LoadingValueReference.(LocalCache.java:3511)
>     at 
> org.sparkproject.guava.cache.LocalCache$LoadingValueReference.(LocalCache.java:3515)
>     at 
> org.sparkproject.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2168)
>     at 
> org.sparkproject.guava.cache.LocalCache$Segment.get(LocalCache.java:2079)
>     at org.sparkproject.guava.cache.LocalCache.get(LocalCache.java:4011)
>     at org.sparkproject.guava.cache.LocalCache.getOrLoad(LocalCache.java:4034)
>     at 
> org.sparkproject.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5010)
>     at 
> 

[jira] [Updated] (SPARK-45593) Building a runnable distribution from master code running spark-sql raise error "java.lang.ClassNotFoundException: org.sparkproject.guava.util.concurrent.internal.Intern

2024-04-01 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-45593?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-45593:
-
Fix Version/s: 3.5.2

> Building a runnable distribution from master code running spark-sql raise 
> error "java.lang.ClassNotFoundException: 
> org.sparkproject.guava.util.concurrent.internal.InternalFutureFailureAccess"
> ---
>
> Key: SPARK-45593
> URL: https://issues.apache.org/jira/browse/SPARK-45593
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: yikaifei
>Assignee: yikaifei
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2
>
>
> Building a runnable distribution from the master branch and running spark-sql raises the error
> "java.lang.ClassNotFoundException:
> org.sparkproject.guava.util.concurrent.internal.InternalFutureFailureAccess".
> Reproduction steps, starting from a fresh clone of the Spark master branch:
>  # Build a runnable distribution:
> `/dev/make-distribution.sh --name ui --pip --tgz -Phive -Phive-thriftserver
> -Pyarn -Pconnect`
>  # Install the runnable distribution package
>  # Run `bin/spark-sql`
> The following error is raised:
> {code:java}
>  23/10/18 20:51:46 WARN NativeCodeLoader: Unable to load native-hadoop 
> library for your platform... using builtin-java classes where applicable
> Exception in thread "main" java.lang.NoClassDefFoundError: 
> org/sparkproject/guava/util/concurrent/internal/InternalFutureFailureAccess
>     at java.base/java.lang.ClassLoader.defineClass1(Native Method)
>     at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1012)
>     at 
> java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:150)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:862)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(BuiltinClassLoader.java:760)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:681)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:639)
>     at 
> java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
>     at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520)
>     at java.base/java.lang.ClassLoader.defineClass1(Native Method)
>     at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1012)
>     at 
> java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:150)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:862)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(BuiltinClassLoader.java:760)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:681)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:639)
>     at 
> java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
>     at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520)
>     at java.base/java.lang.ClassLoader.defineClass1(Native Method)
>     at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1012)
>     at 
> java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:150)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:862)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(BuiltinClassLoader.java:760)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:681)
>     at 
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:639)
>     at 
> java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
>     at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520)
>     at 
> org.sparkproject.guava.cache.LocalCache$LoadingValueReference.(LocalCache.java:3511)
>     at 
> org.sparkproject.guava.cache.LocalCache$LoadingValueReference.(LocalCache.java:3515)
>     at 
> org.sparkproject.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2168)
>     at 
> org.sparkproject.guava.cache.LocalCache$Segment.get(LocalCache.java:2079)
>     at org.sparkproject.guava.cache.LocalCache.get(LocalCache.java:4011)
>     at org.sparkproject.guava.cache.LocalCache.getOrLoad(LocalCache.java:4034)
>     at 
> org.sparkproject.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5010)
>     at 
> 

[jira] [Updated] (SPARK-47645) Make Spark build with -release instead of -target

2024-03-29 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47645?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-47645:
-
Description: https://github.com/scala/scala/pull/9982

> Make Spark build with -release instead of -target 
> --
>
> Key: SPARK-47645
> URL: https://issues.apache.org/jira/browse/SPARK-47645
> Project: Spark
>  Issue Type: Improvement
>  Components: Build, Spark Core, SQL, YARN
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>
> https://github.com/scala/scala/pull/9982
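The linked scala/scala pull request deprecates `-target` in favour of `-release`; besides setting the emitted bytecode level, `-release` also restricts compilation to the JDK API of the chosen version. A hedged sketch of the kind of sbt setting this implies (the JDK version 17 is illustrative, not necessarily what the Spark build uses):

{code:scala}
// Illustrative fragment of an sbt build definition, not the exact SparkBuild.scala change.
scalacOptions ++= Seq("-release", "17")   // replaces the deprecated -target flag
javacOptions ++= Seq("--release", "17")   // javac's equivalent flag
{code}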






[jira] [Created] (SPARK-47645) Make Spark build with -release instead of -target

2024-03-29 Thread Yang Jie (Jira)
Yang Jie created SPARK-47645:


 Summary: Make Spark build with -release instead of -target 
 Key: SPARK-47645
 URL: https://issues.apache.org/jira/browse/SPARK-47645
 Project: Spark
  Issue Type: Improvement
  Components: Build, Spark Core, SQL, YARN
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Resolved] (SPARK-47629) Add `common/variant` and `connector/kinesis-asl` to maven daily test module list

2024-03-28 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47629?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-47629.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45754
[https://github.com/apache/spark/pull/45754]

> Add `common/variant` and `connector/kinesis-asl` to maven daily test module 
> list
> 
>
> Key: SPARK-47629
> URL: https://issues.apache.org/jira/browse/SPARK-47629
> Project: Spark
>  Issue Type: Improvement
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-47629) Add `common/variant` and `connector/kinesis-asl` to maven daily test module list

2024-03-28 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47629?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-47629:


Assignee: Yang Jie

> Add `common/variant` and `connector/kinesis-asl` to maven daily test module 
> list
> 
>
> Key: SPARK-47629
> URL: https://issues.apache.org/jira/browse/SPARK-47629
> Project: Spark
>  Issue Type: Improvement
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Updated] (SPARK-47642) Exclude `org.junit.jupiter` and `org.junit.platform` from `jmock-junit5`

2024-03-28 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47642?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-47642:
-
Summary: Exclude `org.junit.jupiter` and `org.junit.platform` from 
`jmock-junit5`  (was: Exclude `junit-jupiter-api` and `org.junit.platform` from 
`jmock-junit5`)

> Exclude `org.junit.jupiter` and `org.junit.platform` from `jmock-junit5`
> 
>
> Key: SPARK-47642
> URL: https://issues.apache.org/jira/browse/SPARK-47642
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>







[jira] [Created] (SPARK-47642) Exclude `junit-jupiter-api` and `org.junit.platform` from `jmock-junit5`

2024-03-28 Thread Yang Jie (Jira)
Yang Jie created SPARK-47642:


 Summary: Exclude `junit-jupiter-api` and `org.junit.platform` from 
`jmock-junit5`
 Key: SPARK-47642
 URL: https://issues.apache.org/jira/browse/SPARK-47642
 Project: Spark
  Issue Type: Bug
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Updated] (SPARK-47629) Add `common/variant` and `connector/kinesis-asl` to maven daily test module list

2024-03-28 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47629?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-47629:
-
Summary: Add `common/variant` and `connector/kinesis-asl` to maven daily 
test module list  (was: Add `common/variant` to maven daily test module list)

> Add `common/variant` and `connector/kinesis-asl` to maven daily test module 
> list
> 
>
> Key: SPARK-47629
> URL: https://issues.apache.org/jira/browse/SPARK-47629
> Project: Spark
>  Issue Type: Improvement
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-47629) Add `common/variant` to maven daily test module list

2024-03-28 Thread Yang Jie (Jira)
Yang Jie created SPARK-47629:


 Summary: Add `common/variant` to maven daily test module list
 Key: SPARK-47629
 URL: https://issues.apache.org/jira/browse/SPARK-47629
 Project: Spark
  Issue Type: Improvement
  Components: Project Infra
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Resolved] (SPARK-47610) Always set io.netty.tryReflectionSetAccessible=true

2024-03-26 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47610?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-47610.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45733
[https://github.com/apache/spark/pull/45733]

> Always set io.netty.tryReflectionSetAccessible=true
> ---
>
> Key: SPARK-47610
> URL: https://issues.apache.org/jira/browse/SPARK-47610
> Project: Spark
>  Issue Type: Improvement
>  Components: Build, Spark Core
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-47610) Always set io.netty.tryReflectionSetAccessible=true

2024-03-26 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47610?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-47610:


Assignee: Cheng Pan

> Always set io.netty.tryReflectionSetAccessible=true
> ---
>
> Key: SPARK-47610
> URL: https://issues.apache.org/jira/browse/SPARK-47610
> Project: Spark
>  Issue Type: Improvement
>  Components: Build, Spark Core
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-47536) Upgrade jmock-junit5 to 2.13.1

2024-03-24 Thread Yang Jie (Jira)
Yang Jie created SPARK-47536:


 Summary: Upgrade jmock-junit5 to 2.13.1
 Key: SPARK-47536
 URL: https://issues.apache.org/jira/browse/SPARK-47536
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie


https://github.com/jmock-developers/jmock-library/releases/tag/2.13.1






[jira] [Created] (SPARK-47523) Replace Deprecated `JsonParser#getCurrentName` with `JsonParser#currentName`

2024-03-22 Thread Yang Jie (Jira)
Yang Jie created SPARK-47523:


 Summary: Replace Deprecated `JsonParser#getCurrentName` with 
`JsonParser#currentName`
 Key: SPARK-47523
 URL: https://issues.apache.org/jira/browse/SPARK-47523
 Project: Spark
  Issue Type: Improvement
  Components: SQL
Affects Versions: 4.0.0
Reporter: Yang Jie


[https://github.com/FasterXML/jackson-core/blob/8fba680579885bf9cdae72e93f16de557056d6e3/src/main/java/com/fasterxml/jackson/core/JsonParser.java#L1521-L1551]

 
{code:java}
    /**
     * Deprecated alias of {@link #currentName()}.
     *
     * @return Name of the current field in the parsing context
     *
     * @throws IOException for low-level read issues, or
     *   {@link JsonParseException} for decoding problems
     *
     * @deprecated Since 2.17 use {@link #currentName} instead.
     */
    @Deprecated
    public abstract String getCurrentName() throws IOException;    /**
     * Method that can be called to get the name associated with
     * the current token: for {@link JsonToken#FIELD_NAME}s it will
     * be the same as what {@link #getText} returns;
     * for field values it will be preceding field name;
     * and for others (array values, root-level values) null.
     *
     * @return Name of the current field in the parsing context
     *
     * @throws IOException for low-level read issues, or
     *   {@link JsonParseException} for decoding problems
     *
     * @since 2.10
     */
    public String currentName() throws IOException {
        // !!! TODO: switch direction in 2.18 or later
        return getCurrentName();
    } {code}
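As the quoted Javadoc shows, `getCurrentName()` is only a deprecated alias of `currentName()`, so the change is a mechanical rename at each call site. A small sketch of the pattern (the parser setup is illustrative, not a Spark code path):

{code:scala}
import com.fasterxml.jackson.core.{JsonFactory, JsonToken}

val parser = new JsonFactory().createParser("""{"name": "spark"}""")
while (parser.nextToken() != null) {
  if (parser.currentToken() == JsonToken.FIELD_NAME) {
    // Was: parser.getCurrentName (deprecated since Jackson 2.17)
    println(parser.currentName())
  }
}
parser.close()
{code}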






[jira] [Assigned] (SPARK-46920) Improve executor exit error message on YARN

2024-03-20 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46920?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46920:


Assignee: Cheng Pan

> Improve executor exit error message on YARN
> ---
>
> Key: SPARK-46920
> URL: https://issues.apache.org/jira/browse/SPARK-46920
> Project: Spark
>  Issue Type: Improvement
>  Components: YARN
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-46920) Improve executor exit error message on YARN

2024-03-20 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46920?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46920.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44951
[https://github.com/apache/spark/pull/44951]

> Improve executor exit error message on YARN
> ---
>
> Key: SPARK-46920
> URL: https://issues.apache.org/jira/browse/SPARK-46920
> Project: Spark
>  Issue Type: Improvement
>  Components: YARN
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Created] (SPARK-47486) Remove unused private method `getString` from `ArrowDeserializers`

2024-03-20 Thread Yang Jie (Jira)
Yang Jie created SPARK-47486:


 Summary: Remove unused private method `getString` from 
`ArrowDeserializers`
 Key: SPARK-47486
 URL: https://issues.apache.org/jira/browse/SPARK-47486
 Project: Spark
  Issue Type: Improvement
  Components: Connect
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Resolved] (SPARK-47455) Fix Resource Handling of `scalaStyleOnCompileConfig` in SparkBuild.scala

2024-03-20 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47455?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-47455.
--
Fix Version/s: 3.4.3
   3.5.2
   4.0.0
   Resolution: Fixed

Issue resolved by pull request 45582
[https://github.com/apache/spark/pull/45582]

> Fix Resource Handling of `scalaStyleOnCompileConfig` in SparkBuild.scala
> 
>
> Key: SPARK-47455
> URL: https://issues.apache.org/jira/browse/SPARK-47455
> Project: Spark
>  Issue Type: Bug
>  Components: Project Infra
>Affects Versions: 3.4.2, 4.0.0, 3.5.1
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 3.4.3, 3.5.2, 4.0.0
>
>
> [https://github.com/apache/spark/blob/e01ed0da22f24204fe23143032ff39be7f4b56af/project/SparkBuild.scala#L157-L173]
>  
> {code:java}
> val scalaStyleOnCompileConfig: String = {
>     val in = "scalastyle-config.xml"
>     val out = "scalastyle-on-compile.generated.xml"
>     val replacements = Map(
>       """customId="println" level="error -> """customId="println" 
> level="warn
>     )
>     var contents = Source.fromFile(in).getLines.mkString("\n")
>     for ((k, v) <- replacements) {
>       require(contents.contains(k), s"Could not rewrite '$k' in original 
> scalastyle config.")
>       contents = contents.replace(k, v)
>     }
>     new PrintWriter(out) {
>       write(contents)
>       close()
>     }
>     out
>   } {code}
> `Source.fromFile(in)` opens a `BufferedSource` resource handle, but it does 
> not close it.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-47455) Fix Resource Handling of `scalaStyleOnCompileConfig` in SparkBuild.scala

2024-03-20 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47455?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-47455:


Assignee: Yang Jie

> Fix Resource Handling of `scalaStyleOnCompileConfig` in SparkBuild.scala
> 
>
> Key: SPARK-47455
> URL: https://issues.apache.org/jira/browse/SPARK-47455
> Project: Spark
>  Issue Type: Bug
>  Components: Project Infra
>Affects Versions: 3.4.2, 4.0.0, 3.5.1
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Minor
>  Labels: pull-request-available
>
> [https://github.com/apache/spark/blob/e01ed0da22f24204fe23143032ff39be7f4b56af/project/SparkBuild.scala#L157-L173]
>  
> {code:java}
> val scalaStyleOnCompileConfig: String = {
>     val in = "scalastyle-config.xml"
>     val out = "scalastyle-on-compile.generated.xml"
>     val replacements = Map(
>       """customId="println" level="error -> """customId="println" 
> level="warn
>     )
>     var contents = Source.fromFile(in).getLines.mkString("\n")
>     for ((k, v) <- replacements) {
>       require(contents.contains(k), s"Could not rewrite '$k' in original 
> scalastyle config.")
>       contents = contents.replace(k, v)
>     }
>     new PrintWriter(out) {
>       write(contents)
>       close()
>     }
>     out
>   } {code}
> `Source.fromFile(in)` opens a `BufferedSource` resource handle, but it does 
> not close it.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-47474) Revert change of SPARK-47461 and add some comments

2024-03-19 Thread Yang Jie (Jira)
Yang Jie created SPARK-47474:


 Summary: Revert change of SPARK-47461 and add some comments
 Key: SPARK-47474
 URL: https://issues.apache.org/jira/browse/SPARK-47474
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Yang Jie






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-47461) Remove unused private function `totalRunningTasksPerResourceProfile` from `ExecutorAllocationManager`

2024-03-19 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47461?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-47461:
-
Summary: Remove unused private function 
`totalRunningTasksPerResourceProfile` from `ExecutorAllocationManager`  (was: 
Remove the unused private function `totalRunningTasksPerResourceProfile` from 
`ExecutorAllocationManager`)

> Remove unused private function `totalRunningTasksPerResourceProfile` from 
> `ExecutorAllocationManager`
> -
>
> Key: SPARK-47461
> URL: https://issues.apache.org/jira/browse/SPARK-47461
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Minor
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-47461) Remove the unused private function `totalRunningTasksPerResourceProfile` from `ExecutorAllocationManager`

2024-03-19 Thread Yang Jie (Jira)
Yang Jie created SPARK-47461:


 Summary: Remove the unused private function 
`totalRunningTasksPerResourceProfile` from `ExecutorAllocationManager`
 Key: SPARK-47461
 URL: https://issues.apache.org/jira/browse/SPARK-47461
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Yang Jie






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-47455) Fix Resource Handling of `scalaStyleOnCompileConfig` in SparkBuild.scala

2024-03-18 Thread Yang Jie (Jira)
Yang Jie created SPARK-47455:


 Summary: Fix Resource Handling of `scalaStyleOnCompileConfig` in 
SparkBuild.scala
 Key: SPARK-47455
 URL: https://issues.apache.org/jira/browse/SPARK-47455
 Project: Spark
  Issue Type: Bug
  Components: Project Infra
Affects Versions: 3.5.1, 3.4.2, 4.0.0
Reporter: Yang Jie


[https://github.com/apache/spark/blob/e01ed0da22f24204fe23143032ff39be7f4b56af/project/SparkBuild.scala#L157-L173]

 
{code:java}
val scalaStyleOnCompileConfig: String = {
    val in = "scalastyle-config.xml"
    val out = "scalastyle-on-compile.generated.xml"
    val replacements = Map(
      """customId="println" level="error -> """customId="println" 
level="warn
    )
    var contents = Source.fromFile(in).getLines.mkString("\n")
    for ((k, v) <- replacements) {
      require(contents.contains(k), s"Could not rewrite '$k' in original 
scalastyle config.")
      contents = contents.replace(k, v)
    }
    new PrintWriter(out) {
      write(contents)
      close()
    }
    out
  } {code}
`Source.fromFile(in)` opens a `BufferedSource` resource handle, but it does not 
close it.
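
A minimal sketch of one way to close the handle (assuming Scala 2.13's 
`scala.util.Using`; the object and method names are illustrative, not 
necessarily the change merged in the pull request):
{code:scala}
import scala.io.Source
import scala.util.Using

object ScalastyleConfigReader {
  // Read the file while guaranteeing the underlying BufferedSource is closed,
  // even if reading throws.
  def read(in: String): String =
    Using.resource(Source.fromFile(in)) { source =>
      source.getLines().mkString("\n")
    }
}
{code}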



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-47369) Fix performance regression in JDK 17 caused from RocksDB logging

2024-03-13 Thread Yang Jie (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-47369?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825945#comment-17825945
 ] 

Yang Jie commented on SPARK-47369:
--

From the current Spark code, it appears that the {{Logger}} is only set for 
the {{RocksDB}} instance built for the external shuffle db (in 
RocksDBProvider), and not for other parts. However, it seems that the Spark 
code does not actively print RocksDB-related logs (perhaps my confirmation 
method is incorrect; could you provide a way to confirm it? [~neilramaswamy])

> Fix performance regression in JDK 17 caused from RocksDB logging
> 
>
> Key: SPARK-47369
> URL: https://issues.apache.org/jira/browse/SPARK-47369
> Project: Spark
>  Issue Type: Bug
>  Components: Structured Streaming
>Affects Versions: 3.3.0, 3.3.1, 3.3.3, 3.4.2, 3.3.2, 3.4.0, 3.4.1, 3.5.0, 
> 3.5.1, 3.3.4
>Reporter: Neil Ramaswamy
>Priority: Major
>
> JDK 17 has a performance regression in the JNI's AttachCurrentThread and 
> DetachCurrentThread calls, as reported here: 
> [https://bugs.openjdk.org/browse/JDK-8314859]. You can find a minimal 
> reproduction of the JDK issue in that bug report. I have marked as affected 
> versions 3.3.0^ since that is when JDK 17 started being offered in Spark.
> For context, every time RocksDB logs, it currently [attaches itself to the 
> JVM|https://github.com/facebook/rocksdb/blob/main/java/rocksjni/loggerjnicallback.cc#L140],
>  invokes the RocksDB [logging callback that we 
> specify|https://github.com/apache/spark/blob/8fcef1657a02189f91d5485eabb5b165706cdce9/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/RocksDB.scala#L839],
>  and then [detaches itself from the 
> JVM|https://github.com/facebook/rocksdb/blob/main/java/rocksjni/loggerjnicallback.cc#L170].
>  These attach/detach calls regressed, causing JDK 17 SS queries to run up to 
> 10-15% slower than their respective JDK 8 queries.
> For example, a 100K record/second dropDuplicates had a p95 latency regression 
> of 12%. A regression of 12% and 21% (at the p95) was observed for a query 
> with 1M record/second, 100K keys, 10 second windows, and 0 second watermark.
> Because the Hotspot folks marked this as "Won't fix," one way to fix this is 
> to avoid the JNI entirely and write the RocksDB to stderr. RocksDB [8.11.3 
> natively supports 
> this|https://github.com/facebook/rocksdb/wiki/Logging-in-RocksJava#configuring-a-native-logger]
>  (I implemented that feature in RocksJava). We can configure our RocksDB 
> logger to do its logging this way.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-47278) Upgrade rocksdbjni to 8.11.3

2024-03-04 Thread Yang Jie (Jira)
Yang Jie created SPARK-47278:


 Summary: Upgrade rocksdbjni to 8.11.3
 Key: SPARK-47278
 URL: https://issues.apache.org/jira/browse/SPARK-47278
 Project: Spark
  Issue Type: Sub-task
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-47269) Upgrade jetty to 11.0.20

2024-03-04 Thread Yang Jie (Jira)
Yang Jie created SPARK-47269:


 Summary: Upgrade jetty to 11.0.20
 Key: SPARK-47269
 URL: https://issues.apache.org/jira/browse/SPARK-47269
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie


Fixes:
 * [CVE-2024-22201|https://github.com/advisories/GHSA-rggv-cv7r-mw98] - HTTP/2 
connection not closed after idle timeout when TCP congested



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-44173) Make Spark an sbt build only project

2024-03-01 Thread Yang Jie (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-44173?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17822747#comment-17822747
 ] 

Yang Jie commented on SPARK-44173:
--

Hi [~dongjoon],

Sorry, I missed the previous message. This Jira was created based on some 
discussions in https://github.com/apache/spark/pull/40317. Now that the Maven 
daily test pipeline is in place, we have a way to discover problems in Maven 
tests in a timely manner, so what this Jira's description proposes has become 
less critical.

I agree with your point, thank you for converting this to a normal Jira :)

> Make Spark an sbt build only project
> 
>
> Key: SPARK-44173
> URL: https://issues.apache.org/jira/browse/SPARK-44173
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Minor
>
> Supporting both Maven and SBT always brings various testing problems and 
> increases the complexity of testing code writing
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-47243) Correct the package name of `StateMetadataSource.scala`

2024-02-29 Thread Yang Jie (Jira)
Yang Jie created SPARK-47243:


 Summary: Correct the package name of `StateMetadataSource.scala`
 Key: SPARK-47243
 URL: https://issues.apache.org/jira/browse/SPARK-47243
 Project: Spark
  Issue Type: Improvement
  Components: Structured Streaming
Affects Versions: 4.0.0
Reporter: Yang Jie






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-47229) Change the never changed 'var' to 'val'

2024-02-29 Thread Yang Jie (Jira)
Yang Jie created SPARK-47229:


 Summary: Change the never changed 'var' to 'val'
 Key: SPARK-47229
 URL: https://issues.apache.org/jira/browse/SPARK-47229
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core, SQL, YARN
Affects Versions: 4.0.0
Reporter: Yang Jie






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-46919) Upgrade `grpcio*` and `grpc-java` to 1.62

2024-02-28 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46919?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46919:
-
Parent: SPARK-47046
Issue Type: Sub-task  (was: Improvement)

> Upgrade `grpcio*` and `grpc-java` to 1.62
> -
>
> Key: SPARK-46919
> URL: https://issues.apache.org/jira/browse/SPARK-46919
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build, Connect
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-46919) Upgrade `grpcio*` and `grpc-java` to 1.62

2024-02-28 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46919?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46919:
-
Summary: Upgrade `grpcio*` and `grpc-java` to 1.62  (was: Upgrade `grpcio*` 
to 1.60.0 and `grpc-java` to 1.61.0)

> Upgrade `grpcio*` and `grpc-java` to 1.62
> -
>
> Key: SPARK-46919
> URL: https://issues.apache.org/jira/browse/SPARK-46919
> Project: Spark
>  Issue Type: Improvement
>  Components: Build, Connect
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-47209) Upgrade slf4j to 2.0.12

2024-02-28 Thread Yang Jie (Jira)
Yang Jie created SPARK-47209:


 Summary: Upgrade slf4j to 2.0.12
 Key: SPARK-47209
 URL: https://issues.apache.org/jira/browse/SPARK-47209
 Project: Spark
  Issue Type: Sub-task
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie


https://www.slf4j.org/news.html#2.0.12



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Comment Edited] (SPARK-47194) Upgrade log4j2 to 2.23.0

2024-02-27 Thread Yang Jie (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-47194?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17821543#comment-17821543
 ] 

Yang Jie edited comment on SPARK-47194 at 2/28/24 7:19 AM:
---

It seems that the `-Dlog4j2.debug` option may not be working in 2.23.0, so 
perhaps we should skip this upgrade. I have tested with the following steps:

1. run `dev/make-distribution.sh --tgz` to build a Spark distribution
2. add `log4j2.properties` and `spark-defaults.conf` with the same content as 
the test case `Verify logging configuration is picked from the provided 
SPARK_CONF_DIR/log4j2.properties`

```
log4j2.properties 
 # This log4j config file is for integration test SparkConfPropagateSuite.
rootLogger.level = debug
rootLogger.appenderRef.stdout.ref = console

appender.console.type = Console
appender.console.name = console
appender.console.target = SYSTEM_ERR
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d\{HH:mm:ss.SSS} %p %c: %maxLen\{%m}{512}%n%ex\{8}%n
```

```
spark-defaults.conf

spark.driver.extraJavaOptions -Dlog4j2.debug
spark.executor.extraJavaOptions -Dlog4j2.debug
spark.kubernetes.executor.deleteOnTermination false
```

3. run `bin/run-example SparkPi`

When using log4j 2.22.1, we get the following logs:

```
...

TRACE StatusLogger DefaultConfiguration cleaning Appenders from 1 LoggerConfigs.
DEBUG StatusLogger Stopped 
org.apache.logging.log4j.core.config.DefaultConfiguration@384ad17b OK
TRACE StatusLogger Reregistering MBeans after reconfigure. 
Selector=org.apache.logging.log4j.core.selector.ClassLoaderContextSelector@5852c06f
TRACE StatusLogger Reregistering context (1/1): '5ffd2b27' 
org.apache.logging.log4j.core.LoggerContext@31190526
TRACE StatusLogger Unregistering but no MBeans found matching 
'org.apache.logging.log4j2:type=5ffd2b27'
TRACE StatusLogger Unregistering but no MBeans found matching 
'org.apache.logging.log4j2:type=5ffd2b27,component=StatusLogger'
TRACE StatusLogger Unregistering but no MBeans found matching 
'org.apache.logging.log4j2:type=5ffd2b27,component=ContextSelector'
TRACE StatusLogger Unregistering but no MBeans found matching 
'org.apache.logging.log4j2:type=5ffd2b27,component=Loggers,name=*'
TRACE StatusLogger Unregistering but no MBeans found matching 
'org.apache.logging.log4j2:type=5ffd2b27,component=Appenders,name=*'
TRACE StatusLogger Unregistering but no MBeans found matching 
'org.apache.logging.log4j2:type=5ffd2b27,component=AsyncAppenders,name=*'
TRACE StatusLogger Unregistering but no MBeans found matching 
'org.apache.logging.log4j2:type=5ffd2b27,component=AsyncLoggerRingBuffer'
TRACE StatusLogger Unregistering but no MBeans found matching 
'org.apache.logging.log4j2:type=5ffd2b27,component=Loggers,name=*,subtype=RingBuffer'
DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=5ffd2b27
DEBUG StatusLogger Registering MBean 
org.apache.logging.log4j2:type=5ffd2b27,component=StatusLogger
DEBUG StatusLogger Registering MBean 
org.apache.logging.log4j2:type=5ffd2b27,component=ContextSelector
DEBUG StatusLogger Registering MBean 
org.apache.logging.log4j2:type=5ffd2b27,component=Loggers,name=
DEBUG StatusLogger Registering MBean 
org.apache.logging.log4j2:type=5ffd2b27,component=Appenders,name=console
TRACE StatusLogger Using default SystemClock for timestamps.
DEBUG StatusLogger org.apache.logging.log4j.core.util.SystemClock supports 
precise timestamps.
TRACE StatusLogger Using DummyNanoClock for nanosecond timestamps.
DEBUG StatusLogger Reconfiguration complete for context[name=5ffd2b27] at URI 
/Users/yangjie01/Tools/4.0/spark-4.0.0-SNAPSHOT-bin-3.3.6/conf/log4j2.properties
 (org.apache.logging.log4j.core.LoggerContext@31190526) with optional 
ClassLoader: null
DEBUG StatusLogger Shutdown hook enabled. Registering a new one.
...
```

But when using log4j 2.23.0, no logs related to `StatusLogger` are printed. 

So let's skip this upgrade.



[jira] [Resolved] (SPARK-47194) Upgrade log4j2 to 2.23.0

2024-02-27 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47194?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-47194.
--
Resolution: Won't Fix

It seems that the `-Dlog4j2.debug` option may not be working in 2.23.0, so 
perhaps we should skip this upgrade. I have tested with the following steps:

1. run `dev/make-distribution.sh --tgz` to build a Spark distribution
2. add `log4j2.properties` and `spark-defaults.conf` with the same content as 
the test case `Verify logging configuration is picked from the provided 
SPARK_CONF_DIR/log4j2.properties`

```
log4j2.properties 

# This log4j config file is for integration test SparkConfPropagateSuite.
rootLogger.level = debug
rootLogger.appenderRef.stdout.ref = console

appender.console.type = Console
appender.console.name = console
appender.console.target = SYSTEM_ERR
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d\{HH:mm:ss.SSS} %p %c: 
%maxLen\{%m}{512}%n%ex\{8}%n
```

```
spark-defaults.conf

spark.driver.extraJavaOptions -Dlog4j2.debug
spark.executor.extraJavaOptions -Dlog4j2.debug
spark.kubernetes.executor.deleteOnTermination false
```

3. run `bin/run-example SparkPi`

When using log4j 2.22.1, we get the following logs:

```
...

TRACE StatusLogger DefaultConfiguration cleaning Appenders from 1 LoggerConfigs.
DEBUG StatusLogger Stopped 
org.apache.logging.log4j.core.config.DefaultConfiguration@384ad17b OK
TRACE StatusLogger Reregistering MBeans after reconfigure. 
Selector=org.apache.logging.log4j.core.selector.ClassLoaderContextSelector@5852c06f
TRACE StatusLogger Reregistering context (1/1): '5ffd2b27' 
org.apache.logging.log4j.core.LoggerContext@31190526
TRACE StatusLogger Unregistering but no MBeans found matching 
'org.apache.logging.log4j2:type=5ffd2b27'
TRACE StatusLogger Unregistering but no MBeans found matching 
'org.apache.logging.log4j2:type=5ffd2b27,component=StatusLogger'
TRACE StatusLogger Unregistering but no MBeans found matching 
'org.apache.logging.log4j2:type=5ffd2b27,component=ContextSelector'
TRACE StatusLogger Unregistering but no MBeans found matching 
'org.apache.logging.log4j2:type=5ffd2b27,component=Loggers,name=*'
TRACE StatusLogger Unregistering but no MBeans found matching 
'org.apache.logging.log4j2:type=5ffd2b27,component=Appenders,name=*'
TRACE StatusLogger Unregistering but no MBeans found matching 
'org.apache.logging.log4j2:type=5ffd2b27,component=AsyncAppenders,name=*'
TRACE StatusLogger Unregistering but no MBeans found matching 
'org.apache.logging.log4j2:type=5ffd2b27,component=AsyncLoggerRingBuffer'
TRACE StatusLogger Unregistering but no MBeans found matching 
'org.apache.logging.log4j2:type=5ffd2b27,component=Loggers,name=*,subtype=RingBuffer'
DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=5ffd2b27
DEBUG StatusLogger Registering MBean 
org.apache.logging.log4j2:type=5ffd2b27,component=StatusLogger
DEBUG StatusLogger Registering MBean 
org.apache.logging.log4j2:type=5ffd2b27,component=ContextSelector
DEBUG StatusLogger Registering MBean 
org.apache.logging.log4j2:type=5ffd2b27,component=Loggers,name=
DEBUG StatusLogger Registering MBean 
org.apache.logging.log4j2:type=5ffd2b27,component=Appenders,name=console
TRACE StatusLogger Using default SystemClock for timestamps.
DEBUG StatusLogger org.apache.logging.log4j.core.util.SystemClock supports 
precise timestamps.
TRACE StatusLogger Using DummyNanoClock for nanosecond timestamps.
DEBUG StatusLogger Reconfiguration complete for context[name=5ffd2b27] at URI 
/Users/yangjie01/Tools/4.0/spark-4.0.0-SNAPSHOT-bin-3.3.6/conf/log4j2.properties
 (org.apache.logging.log4j.core.LoggerContext@31190526) with optional 
ClassLoader: null
DEBUG StatusLogger Shutdown hook enabled. Registering a new one.
...
```

But when using log4j 2.23.0, no logs related to `StatusLogger` are printed. 

cc @dongjoon-hyun 

> Upgrade log4j2 to 2.23.0
> 
>
> Key: SPARK-47194
> URL: https://issues.apache.org/jira/browse/SPARK-47194
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-47194) Upgrade log4j2 to 2.23.0

2024-02-27 Thread Yang Jie (Jira)
Yang Jie created SPARK-47194:


 Summary: Upgrade log4j2 to 2.23.0
 Key: SPARK-47194
 URL: https://issues.apache.org/jira/browse/SPARK-47194
 Project: Spark
  Issue Type: Sub-task
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47100) Upgrade netty to 4.1.107.Final and netty-tcnative to 2.0.62.Final

2024-02-20 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47100?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-47100.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45178
[https://github.com/apache/spark/pull/45178]

> Upgrade netty to 4.1.107.Final and netty-tcnative to 2.0.62.Final
> -
>
> Key: SPARK-47100
> URL: https://issues.apache.org/jira/browse/SPARK-47100
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-47089) Migrate mockito 4 to mockito5

2024-02-18 Thread Yang Jie (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-47089?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17818338#comment-17818338
 ] 

Yang Jie commented on SPARK-47089:
--

Thanks [~panbingkun] 

> Migrate mockito 4 to mockito5
> -
>
> Key: SPARK-47089
> URL: https://issues.apache.org/jira/browse/SPARK-47089
> Project: Spark
>  Issue Type: Improvement
>  Components: Build, Tests
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-47089) Migrate mockito 4 to mockito5

2024-02-18 Thread Yang Jie (Jira)
Yang Jie created SPARK-47089:


 Summary: Migrate mockito 4 to mockito5
 Key: SPARK-47089
 URL: https://issues.apache.org/jira/browse/SPARK-47089
 Project: Spark
  Issue Type: Improvement
  Components: Build, Tests
Affects Versions: 4.0.0
Reporter: Yang Jie






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47084) Upgrade joda-time to 2.12.7

2024-02-18 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47084?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-47084.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45153
[https://github.com/apache/spark/pull/45153]

>  Upgrade joda-time to 2.12.7
> 
>
> Key: SPARK-47084
> URL: https://issues.apache.org/jira/browse/SPARK-47084
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47073) Upgrade several Maven plugins to the latest versions

2024-02-16 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47073?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-47073.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45136
[https://github.com/apache/spark/pull/45136]

> Upgrade several Maven plugins to the latest versions
> 
>
> Key: SPARK-47073
> URL: https://issues.apache.org/jira/browse/SPARK-47073
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> * {{versions-maven-plugin}} from 2.16.0 to 2.16.2.
>  * {{maven-enforcer-plugin}} from 3.3.0 to 3.4.1.
>  * {{maven-compiler-plugin}} from 3.11.0 to 3.12.1.
>  * {{maven-surefire-plugin}} from 3.1.2 to 3.2.5.
>  * {{maven-clean-plugin}} from 3.3.1 to 3.3.2.
>  * {{maven-javadoc-plugin}} from 3.5.0 to 3.6.3.
>  * {{maven-shade-plugin}} from 3.5.0 to 3.5.1.
>  * {{maven-dependency-plugin}} from 3.6.0 to 3.6.1.
>  * {{maven-checkstyle-plugin}} from 3.3.0 to 3.3.1.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-47025) Switch `Guava 19.0` dependency scope from `provided` to `test`

2024-02-12 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47025?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-47025:


Assignee: Dongjoon Hyun

> Switch `Guava 19.0` dependency scope from `provided` to `test`
> --
>
> Key: SPARK-47025
> URL: https://issues.apache.org/jira/browse/SPARK-47025
> Project: Spark
>  Issue Type: Test
>  Components: Build, SQL, Tests
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Minor
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47025) Switch `Guava 19.0` dependency scope from `provided` to `test`

2024-02-12 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47025?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-47025.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45088
[https://github.com/apache/spark/pull/45088]

> Switch `Guava 19.0` dependency scope from `provided` to `test`
> --
>
> Key: SPARK-47025
> URL: https://issues.apache.org/jira/browse/SPARK-47025
> Project: Spark
>  Issue Type: Test
>  Components: Build, SQL, Tests
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-47016) Upgrade scalatest related dependencies to the 3.2.18 series

2024-02-09 Thread Yang Jie (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-47016?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17816030#comment-17816030
 ] 

Yang Jie commented on SPARK-47016:
--

It seems that the `org.scalatestplus:mockito` release corresponding to 3.2.18 
is only published for Mockito 5. Therefore, we may need to migrate to Mockito 
5 first in order to upgrade this set of test dependencies as a whole.

> Upgrade scalatest related dependencies to the 3.2.18 series
> ---
>
> Key: SPARK-47016
> URL: https://issues.apache.org/jira/browse/SPARK-47016
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-47016) Upgrade scalatest related dependencies to the 3.2.18 series

2024-02-09 Thread Yang Jie (Jira)
Yang Jie created SPARK-47016:


 Summary: Upgrade scalatest related dependencies to the 3.2.18 
series
 Key: SPARK-47016
 URL: https://issues.apache.org/jira/browse/SPARK-47016
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-47006) Refactor refill() method to isExhausted() in NioBufferedFileInputStream

2024-02-08 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47006?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-47006:
-
Description: Currently, in NioBufferedFileInputStream, the refill() method 
is always invoked in a negated context (!refill()), which can be confusing and 
counter-intuitive. We can refactor the method so that it's no longer necessary 
to invert the result of the method call.

> Refactor refill() method to isExhausted() in NioBufferedFileInputStream
> ---
>
> Key: SPARK-47006
> URL: https://issues.apache.org/jira/browse/SPARK-47006
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Minor
>
> Currently, in NioBufferedFileInputStream, the refill() method is always 
> invoked in a negated context (!refill()), which can be confusing and 
> counter-intuitive. We can refactor the method so that it's no longer 
> necessary to invert the result of the method call.
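
For illustration, a Scala sketch of the shape of this refactor. The real 
NioBufferedFileInputStream is Java and reads from a file channel, so the class 
and buffer handling below are made up; only the `refill()` -> `isExhausted()` 
rename mirrors the proposal:
{code:scala}
import java.nio.ByteBuffer

// Hypothetical reader used only to illustrate the rename.
class SketchBufferedInput(data: Array[Byte], bufferSize: Int = 4) {
  private var pos = 0
  private val buf = ByteBuffer.allocate(bufferSize)
  buf.flip() // start with an empty (fully consumed) buffer

  // The method now names the condition callers actually check, so call sites
  // read `if (isExhausted())` instead of the double negative `if (!refill())`.
  private def isExhausted(): Boolean = {
    if (!buf.hasRemaining) {
      buf.clear()
      val n = math.min(buf.remaining(), data.length - pos)
      buf.put(data, pos, n)
      pos += n
      buf.flip()
    }
    !buf.hasRemaining
  }

  def read(): Int = if (isExhausted()) -1 else buf.get() & 0xFF
}
{code}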



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-47006) Refactor refill() method to isExhausted() in NioBufferedFileInputStream

2024-02-08 Thread Yang Jie (Jira)
Yang Jie created SPARK-47006:


 Summary: Refactor refill() method to isExhausted() in 
NioBufferedFileInputStream
 Key: SPARK-47006
 URL: https://issues.apache.org/jira/browse/SPARK-47006
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Yang Jie






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-46615) Support s.c.immutable.ArraySeq in ArrowDeserializers

2024-02-07 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46615?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46615:


Assignee: BingKun Pan

> Support s.c.immutable.ArraySeq in ArrowDeserializers
> 
>
> Key: SPARK-46615
> URL: https://issues.apache.org/jira/browse/SPARK-46615
> Project: Spark
>  Issue Type: Improvement
>  Components: Connect
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-46615) Support s.c.immutable.ArraySeq in ArrowDeserializers

2024-02-07 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46615?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46615.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44618
[https://github.com/apache/spark/pull/44618]

> Support s.c.immutable.ArraySeq in ArrowDeserializers
> 
>
> Key: SPARK-46615
> URL: https://issues.apache.org/jira/browse/SPARK-46615
> Project: Spark
>  Issue Type: Improvement
>  Components: Connect
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-47005) Refine docstring of `asc_nulls_first/asc_nulls_last/desc_nulls_first/desc_nulls_last`

2024-02-07 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47005?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-47005:


Assignee: Yang Jie

> Refine docstring of 
> `asc_nulls_first/asc_nulls_last/desc_nulls_first/desc_nulls_last`
> -
>
> Key: SPARK-47005
> URL: https://issues.apache.org/jira/browse/SPARK-47005
> Project: Spark
>  Issue Type: Sub-task
>  Components: Documentation, PySpark
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47005) Refine docstring of `asc_nulls_first/asc_nulls_last/desc_nulls_first/desc_nulls_last`

2024-02-07 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47005?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-47005.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45066
[https://github.com/apache/spark/pull/45066]

> Refine docstring of 
> `asc_nulls_first/asc_nulls_last/desc_nulls_first/desc_nulls_last`
> -
>
> Key: SPARK-47005
> URL: https://issues.apache.org/jira/browse/SPARK-47005
> Project: Spark
>  Issue Type: Sub-task
>  Components: Documentation, PySpark
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-47005) Refine docstring of `asc_nulls_first/asc_nulls_last/desc_nulls_first/desc_nulls_last`

2024-02-07 Thread Yang Jie (Jira)
Yang Jie created SPARK-47005:


 Summary: Refine docstring of 
`asc_nulls_first/asc_nulls_last/desc_nulls_first/desc_nulls_last`
 Key: SPARK-47005
 URL: https://issues.apache.org/jira/browse/SPARK-47005
 Project: Spark
  Issue Type: Sub-task
  Components: Documentation, PySpark
Affects Versions: 4.0.0
Reporter: Yang Jie






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-46987) ProtoUtils.abbreviate avoid unnecessary setField

2024-02-06 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46987?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46987:


Assignee: Ruifeng Zheng

> ProtoUtils.abbreviate avoid unnecessary setField
> 
>
> Key: SPARK-46987
> URL: https://issues.apache.org/jira/browse/SPARK-46987
> Project: Spark
>  Issue Type: Improvement
>  Components: Connect
>Affects Versions: 4.0.0
>Reporter: Ruifeng Zheng
>Assignee: Ruifeng Zheng
>Priority: Major
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-46987) ProtoUtils.abbreviate avoid unnecessary setField

2024-02-06 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46987?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46987.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45045
[https://github.com/apache/spark/pull/45045]

> ProtoUtils.abbreviate avoid unnecessary setField
> 
>
> Key: SPARK-46987
> URL: https://issues.apache.org/jira/browse/SPARK-46987
> Project: Spark
>  Issue Type: Improvement
>  Components: Connect
>Affects Versions: 4.0.0
>Reporter: Ruifeng Zheng
>Assignee: Ruifeng Zheng
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-46895) Replace Timer with single thread scheduled executor

2024-02-06 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46895?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46895.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44718
[https://github.com/apache/spark/pull/44718]

> Replace Timer with single thread scheduled executor
> ---
>
> Key: SPARK-46895
> URL: https://issues.apache.org/jira/browse/SPARK-46895
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Jiaan Geng
>Assignee: Jiaan Geng
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Some code in Spark still uses java.util.Timer.
> We should replace Timer with a single-thread scheduled executor.
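
A minimal sketch of the replacement pattern (illustrative only; the task, 
delays, and object name are made up, and Spark's own thread-naming helpers are 
not shown):
{code:scala}
import java.util.concurrent.{Executors, TimeUnit}

object ScheduledExecutorSketch {
  def main(args: Array[String]): Unit = {
    // A single-thread scheduled executor covers the same use cases as
    // java.util.Timer, but an uncaught exception in one task does not kill
    // the scheduler for every other scheduled task, and shutdown is explicit.
    val scheduler = Executors.newSingleThreadScheduledExecutor()

    val heartbeat: Runnable = () => println("periodic heartbeat")

    // Run every second after an initial one-second delay.
    scheduler.scheduleWithFixedDelay(heartbeat, 1L, 1L, TimeUnit.SECONDS)

    Thread.sleep(3500)
    scheduler.shutdownNow()
  }
}
{code}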



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-46978) Refine docstring of `sum_distinct/array_agg/count_if`

2024-02-05 Thread Yang Jie (Jira)
Yang Jie created SPARK-46978:


 Summary: Refine docstring of `sum_distinct/array_agg/count_if`
 Key: SPARK-46978
 URL: https://issues.apache.org/jira/browse/SPARK-46978
 Project: Spark
  Issue Type: Sub-task
  Components: Documentation, PySpark
Affects Versions: 4.0.0
Reporter: Yang Jie






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-44319) Migrate jersey 2 to jersey 3

2024-02-03 Thread Yang Jie (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-44319?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17814041#comment-17814041
 ] 

Yang Jie commented on SPARK-44319:
--

done

> Migrate jersey 2 to jersey 3
> 
>
> Key: SPARK-44319
> URL: https://issues.apache.org/jira/browse/SPARK-44319
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build, Spark Core
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Minor
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-46970) Rewrite `OpenHashSet#hasher` with pattern matching

2024-02-03 Thread Yang Jie (Jira)
Yang Jie created SPARK-46970:


 Summary: Rewrite `OpenHashSet#hasher` with pattern matching
 Key: SPARK-46970
 URL: https://issues.apache.org/jira/browse/SPARK-46970
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Yang Jie






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-46969) Recover `to_timestamp('366', 'DD')` test case in `datetime-parsing-invalid.sql`

2024-02-03 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46969?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46969.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45011
[https://github.com/apache/spark/pull/45011]

> Recover `to_timestamp('366', 'DD')` test case in 
> `datetime-parsing-invalid.sql`
> ---
>
> Key: SPARK-46969
> URL: https://issues.apache.org/jira/browse/SPARK-46969
> Project: Spark
>  Issue Type: Test
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Trivial
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-46969) Recover `to_timestamp('366', 'DD')` test case in `datetime-parsing-invalid.sql`

2024-02-03 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46969?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46969:


Assignee: Dongjoon Hyun

> Recover `to_timestamp('366', 'DD')` test case in 
> `datetime-parsing-invalid.sql`
> ---
>
> Key: SPARK-46969
> URL: https://issues.apache.org/jira/browse/SPARK-46969
> Project: Spark
>  Issue Type: Test
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Trivial
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-46939) Simplify IndylambdaScalaClosures#getSerializationProxy

2024-01-31 Thread Yang Jie (Jira)
Yang Jie created SPARK-46939:


 Summary: Simplify IndylambdaScalaClosures#getSerializationProxy
 Key: SPARK-46939
 URL: https://issues.apache.org/jira/browse/SPARK-46939
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Yang Jie






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-45522) Migrate jetty 9 to jetty 10

2024-01-31 Thread Yang Jie (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-45522?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17813043#comment-17813043
 ] 

Yang Jie commented on SPARK-45522:
--

[~HF] I have created SPARK-46938; we can use that ticket to complete the 
upgrade from Jetty 10 to 11 (or 12).

> Migrate jetty 9 to jetty 10
> ---
>
> Key: SPARK-45522
> URL: https://issues.apache.org/jira/browse/SPARK-45522
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: HiuFung Kwok
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Jetty 12 supports Jakarta EE 8, 9, and 10 simultaneously. But the version 
> span is quite large and the documentation needs to be read in detail; I am 
> not sure this can be completed within the 4.0 cycle, so it is set to low 
> priority.
>  
>  
>  
>  
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-46938) Migrate jetty 10 to jetty 11

2024-01-31 Thread Yang Jie (Jira)
Yang Jie created SPARK-46938:


 Summary: Migrate jetty 10 to jetty 11
 Key: SPARK-46938
 URL: https://issues.apache.org/jira/browse/SPARK-46938
 Project: Spark
  Issue Type: Sub-task
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-45522) Migrate jetty 9 to jetty 10

2024-01-31 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-45522?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-45522:
-
Summary: Migrate jetty 9 to jetty 10  (was: Migrate jetty 9 to jetty 12)

> Migrate jetty 9 to jetty 10
> ---
>
> Key: SPARK-45522
> URL: https://issues.apache.org/jira/browse/SPARK-45522
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: HiuFung Kwok
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Jetty 12 supports Jakarta EE 8, 9, and 10 simultaneously. But the version 
> span is quite large and the documentation needs to be read in detail; I am 
> not sure this can be completed within the 4.0 cycle, so it is set to low 
> priority.
>  
>  
>  
>  
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-46918) Replace self-defined variables with Hadoop ContainerExitStatus

2024-01-30 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46918?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46918.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44950
[https://github.com/apache/spark/pull/44950]

> Replace self-defined variables with Hadoop ContainerExitStatus
> --
>
> Key: SPARK-46918
> URL: https://issues.apache.org/jira/browse/SPARK-46918
> Project: Spark
>  Issue Type: Improvement
>  Components: YARN
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-46918) Replace self-defined variables with Hadoop ContainerExitStatus

2024-01-30 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46918?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46918:


Assignee: Cheng Pan

> Replace self-defined variables with Hadoop ContainerExitStatus
> --
>
> Key: SPARK-46918
> URL: https://issues.apache.org/jira/browse/SPARK-46918
> Project: Spark
>  Issue Type: Improvement
>  Components: YARN
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-46919) Upgrade `grpcio*` to 1.60.0 and `grpc-java` to 1.61.0

2024-01-30 Thread Yang Jie (Jira)
Yang Jie created SPARK-46919:


 Summary: Upgrade `grpcio*` to 1.60.0 and `grpc-java` to 1.61.0
 Key: SPARK-46919
 URL: https://issues.apache.org/jira/browse/SPARK-46919
 Project: Spark
  Issue Type: Improvement
  Components: Build, Connect
Affects Versions: 4.0.0
Reporter: Yang Jie






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-46900) Upgrade slf4j to 2.0.11

2024-01-28 Thread Yang Jie (Jira)
Yang Jie created SPARK-46900:


 Summary: Upgrade slf4j to 2.0.11
 Key: SPARK-46900
 URL: https://issues.apache.org/jira/browse/SPARK-46900
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie


This release reinstates the `renderLevel()` method in {{SimpleLogger}}, which 
was removed by mistake.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-46898) Simplify the protobuf function transformation in Planner

2024-01-28 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46898?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46898.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44925
[https://github.com/apache/spark/pull/44925]

> Simplify the protobuf function transformation in Planner
> 
>
> Key: SPARK-46898
> URL: https://issues.apache.org/jira/browse/SPARK-46898
> Project: Spark
>  Issue Type: Improvement
>  Components: Connect
>Affects Versions: 4.0.0
>Reporter: Ruifeng Zheng
>Assignee: Ruifeng Zheng
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-46898) Simplify the protobuf function transformation in Planner

2024-01-28 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46898?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46898:


Assignee: Ruifeng Zheng

> Simplify the protobuf function transformation in Planner
> 
>
> Key: SPARK-46898
> URL: https://issues.apache.org/jira/browse/SPARK-46898
> Project: Spark
>  Issue Type: Improvement
>  Components: Connect
>Affects Versions: 4.0.0
>Reporter: Ruifeng Zheng
>Assignee: Ruifeng Zheng
>Priority: Minor
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org


