[jira] [Assigned] (SPARK-48929) View fails with internal error after upgrade causes expected syntax error.

2024-07-22 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48929?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48929:
--

Assignee: Serge Rielau

> View fails with internal error after upgrade causes expected syntax error.
> --
>
> Key: SPARK-48929
> URL: https://issues.apache.org/jira/browse/SPARK-48929
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Serge Rielau
>Assignee: Serge Rielau
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> On older Spark:
> CREATE VIEW v AS SELECT 1 ! IN (2);
> SELECT * FROM v;
> => true
> Upgrade to Spark 4
> SELECT * FROM v;
> Internal error.
> This makes it hard to debug the problem.
> Rather than assuming that a failure to parse view text is an internal error,
> we should assume that something (such as the upgrade) broke the view and expose the actual parse error.
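A minimal Scala sketch of the proposed behavior, using a purely hypothetical helper and error type rather than Spark's actual internals: when stored view text no longer parses, report the view name and the underlying parser message instead of an internal error.

{code:scala}
// Hypothetical sketch: `parse` stands in for the SQL parser entry point.
class ViewParseError(viewName: String, cause: Throwable)
  extends RuntimeException(
    s"The stored text of view `$viewName` no longer parses; it may have been created " +
      s"by an older Spark version. Underlying error: ${cause.getMessage}", cause)

def parseViewText[Plan](parse: String => Plan, viewName: String, viewText: String): Plan =
  try parse(viewText)
  catch { case e: Exception => throw new ViewParseError(viewName, e) }
{code}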






[jira] [Resolved] (SPARK-48929) View fails with internal error after upgrade causes expected syntax error.

2024-07-22 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48929?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48929.

Resolution: Fixed

Issue resolved by pull request 47405
[https://github.com/apache/spark/pull/47405]

> View fails with internal error after upgrade causes expected syntax error.
> --
>
> Key: SPARK-48929
> URL: https://issues.apache.org/jira/browse/SPARK-48929
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Serge Rielau
>Assignee: Serge Rielau
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> On older Spark:
> CREATE VIEW v AS SELECT 1 ! IN (2);
> SELECT * FROM v;
> => true
> Upgrade to Spark 4
> SELECT * FROM v;
> Internal error.
> This makes it hard to debug the problem.
> Rather than assuming that a failure to parse view text is an internal error,
> we should assume that something (such as the upgrade) broke the view and expose the actual parse error.






[jira] [Assigned] (SPARK-48592) Add scala style check for logging message inline variables

2024-07-20 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48592?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48592:
--

Assignee: Amanda Liu

> Add scala style check for logging message inline variables
> --
>
> Key: SPARK-48592
> URL: https://issues.apache.org/jira/browse/SPARK-48592
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Amanda Liu
>Assignee: Amanda Liu
>Priority: Minor
>  Labels: pull-request-available
>
> Ban log messages passed to logInfo, logWarning, and logError that interpolate
> variables without wrapping them in {{MDC}}.
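For illustration, here is the kind of call the check is meant to flag versus accept. This assumes the `log` interpolator and the `MDC`/`LogKeys` helpers of the structured logging framework; the imports and the key name are my reading of that API and may differ in detail.

{code:scala}
import org.apache.spark.internal.{Logging, LogKeys, MDC}

class TaskReporter extends Logging {
  def report(taskId: Long): Unit = {
    // Flagged by the style check: the variable is interpolated directly, so
    // structured log consumers only see it embedded in the message text.
    // logInfo(s"Task $taskId finished")

    // Accepted: the variable is wrapped in MDC with a declared log key and
    // becomes a queryable field in the structured (JSON) output.
    logInfo(log"Task ${MDC(LogKeys.TASK_ID, taskId)} finished")
  }
}
{code}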






[jira] [Resolved] (SPARK-48592) Add scala style check for logging message inline variables

2024-07-20 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48592?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48592.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 47239
[https://github.com/apache/spark/pull/47239]

> Add scala style check for logging message inline variables
> --
>
> Key: SPARK-48592
> URL: https://issues.apache.org/jira/browse/SPARK-48592
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Amanda Liu
>Assignee: Amanda Liu
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Ban log messages passed to logInfo, logWarning, and logError that interpolate
> variables without wrapping them in {{MDC}}.






[jira] [Assigned] (SPARK-48890) Add Streaming related fields to log4j ThreadContext

2024-07-18 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48890?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48890:
--

Assignee: Wei Liu

> Add Streaming related fields to log4j ThreadContext
> ---
>
> Key: SPARK-48890
> URL: https://issues.apache.org/jira/browse/SPARK-48890
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core, SS
>Affects Versions: 4.0.0
>Reporter: Wei Liu
>Assignee: Wei Liu
>Priority: Major
>  Labels: pull-request-available
>
> Structured Streaming queries carry a few identifiers that are especially useful in
> logs: each query has a query_id and a run_id, and when MicroBatchExecution (the
> default) is used there is also a batch_id.
>  
> A (query_id, run_id, batch_id) tuple identifies the micro-batch a streaming query
> is running. Adding these fields to the log4j ThreadContext would help, especially
> when multiple queries are running.
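A minimal sketch of the idea using log4j2's `ThreadContext` API; the field names follow the description above, while the actual wiring into Spark's streaming execution is omitted.

{code:scala}
import org.apache.logging.log4j.ThreadContext

// Attach the streaming identifiers for the duration of a micro-batch so that
// every log line emitted by `body` carries them as structured fields.
def withStreamingLogContext[T](queryId: String, runId: String, batchId: Long)(body: => T): T = {
  ThreadContext.put("query_id", queryId)
  ThreadContext.put("run_id", runId)
  ThreadContext.put("batch_id", batchId.toString)
  try body
  finally Seq("query_id", "run_id", "batch_id").foreach(k => ThreadContext.remove(k))
}
{code}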






[jira] [Resolved] (SPARK-48890) Add Streaming related fields to log4j ThreadContext

2024-07-18 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48890?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48890.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 47340
[https://github.com/apache/spark/pull/47340]

> Add Streaming related fields to log4j ThreadContext
> ---
>
> Key: SPARK-48890
> URL: https://issues.apache.org/jira/browse/SPARK-48890
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core, SS
>Affects Versions: 4.0.0
>Reporter: Wei Liu
>Assignee: Wei Liu
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Structured Streaming queries carry a few identifiers that are especially useful in
> logs: each query has a query_id and a run_id, and when MicroBatchExecution (the
> default) is used there is also a batch_id.
>  
> A (query_id, run_id, batch_id) tuple identifies the micro-batch a streaming query
> is running. Adding these fields to the log4j ThreadContext would help, especially
> when multiple queries are running.






[jira] [Resolved] (SPARK-48744) Log entry should be constructed only once

2024-06-27 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48744.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 47135
[https://github.com/apache/spark/pull/47135]

> Log entry should be constructed only once
> -
>
> Key: SPARK-48744
> URL: https://issues.apache.org/jira/browse/SPARK-48744
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
> Fix For: 4.0.0
>
>
> In the current implementation:
>  
> {code:java}
> class LogEntry(messageWithContext: => MessageWithContext) {
>   def message: String = messageWithContext.message
>   def context: java.util.HashMap[String, String] = messageWithContext.context
> }
> def logInfo(entry: LogEntry): Unit = {
>   if (log.isInfoEnabled) {
>     withLogContext(entry.context) {
>       log.info(entry.message)
>     }
>   }
> }{code}
>  
>  
> The by-name parameter `messageWithContext` is evaluated twice: once via
> `entry.context` and again via `entry.message`.
> We should construct it only once.
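One way to get the construct-once behavior is to cache the by-name argument; a sketch, not necessarily the exact change merged in the pull request:

{code:scala}
class LogEntry(messageWithContext: => MessageWithContext) {
  // Evaluate the by-name argument once and reuse it for both accessors.
  private lazy val cached: MessageWithContext = messageWithContext

  def message: String = cached.message
  def context: java.util.HashMap[String, String] = cached.context
}
{code}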






[jira] [Updated] (SPARK-48744) Log entry should be constructed only once

2024-06-27 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang updated SPARK-48744:
---
Description: 
In the current implementation:

 
{code:java}
class LogEntry(messageWithContext: => MessageWithContext) {
  def message: String = messageWithContext.message
  def context: java.util.HashMap[String, String] = messageWithContext.context
}

def logInfo(entry: LogEntry): Unit = {
  if (log.isInfoEnabled) {
    withLogContext(entry.context) {
      log.info(entry.message)
    }
  }
}{code}
 

 

The by-name parameter `messageWithContext` is evaluated twice: once via
`entry.context` and again via `entry.message`.

We should construct it only once.

  was:
In the current implementation:

```

class LogEntry(messageWithContext: => MessageWithContext) {

  def message: String = mwc.message

  def context: java.util.HashMap[String, String] = mwc.context
}

def logInfo(entry: LogEntry): Unit = {
  if (log.isInfoEnabled) {
    withLogContext(entry.context) {
      log.info(entry.message)
    }
  }
}

```

The field `messageWithContext` is constructed twice, one from `entry.context` 
and another one from `entry.message`.

We should improve this.


> Log entry should be constructed only once
> -
>
> Key: SPARK-48744
> URL: https://issues.apache.org/jira/browse/SPARK-48744
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>
> In the current implementation:
>  
> {code:java}
> class LogEntry(messageWithContext: => MessageWithContext) {
>   def message: String = messageWithContext.message
>   def context: java.util.HashMap[String, String] = messageWithContext.context
> }
> def logInfo(entry: LogEntry): Unit = {
>   if (log.isInfoEnabled) {
>     withLogContext(entry.context) {
>       log.info(entry.message)
>     }
>   }
> }{code}
>  
>  
> The by-name parameter `messageWithContext` is evaluated twice: once via
> `entry.context` and again via `entry.message`.
> We should construct it only once.






[jira] [Updated] (SPARK-48744) Log entry should be constructed only once

2024-06-27 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang updated SPARK-48744:
---
Description: 
In the current implementation:

```

class LogEntry(messageWithContext: => MessageWithContext) {

  def message: String = mwc.message

  def context: java.util.HashMap[String, String] = mwc.context
}

def logInfo(entry: LogEntry): Unit = {
  if (log.isInfoEnabled) {
    withLogContext(entry.context) {
      log.info(entry.message)
    }
  }
}

```

The field `messageWithContext` is constructed twice, one from `entry.context` 
and another one from `entry.message`.

We should improve this.

  was:
In the current implementation:

```

class LogEntry(messageWithContext: => MessageWithContext) {
  def message: String = messageWithContext.message

  def context: java.util.HashMap[String, String] = messageWithContext.context
}

def logInfo(entry: LogEntry): Unit = {
    if (log.isInfoEnabled) {
      withLogContext(entry.context) {
        log.info(entry.message)
      }
    }
  }

```

The field `messageWithContext` is constructed twice, one from `entry.context` 
and another one from `entry.message`.

We should improve this.


> Log entry should be constructed only once
> -
>
> Key: SPARK-48744
> URL: https://issues.apache.org/jira/browse/SPARK-48744
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>
> In the current implementation:
> ```
> class LogEntry(messageWithContext: => MessageWithContext) {
>   def message: String = mwc.message
>   def context: java.util.HashMap[String, String] = mwc.context
> }
> def logInfo(entry: LogEntry): Unit = {
>   if (log.isInfoEnabled) {
>     withLogContext(entry.context) {
>       log.info(entry.message)
>     }
>   }
> }
> ```
> The field `messageWithContext` is constructed twice, one from `entry.context` 
> and another one from `entry.message`.
> We should improve this.






[jira] [Created] (SPARK-48744) Log entry should be constructed once

2024-06-27 Thread Gengliang Wang (Jira)
Gengliang Wang created SPARK-48744:
--

 Summary: Log entry should be constructed once
 Key: SPARK-48744
 URL: https://issues.apache.org/jira/browse/SPARK-48744
 Project: Spark
  Issue Type: Sub-task
  Components: SQL
Affects Versions: 4.0.0
Reporter: Gengliang Wang
Assignee: Gengliang Wang


In the current implementation:

```

class LogEntry(messageWithContext: => MessageWithContext) {
  def message: String = messageWithContext.message

  def context: java.util.HashMap[String, String] = messageWithContext.context
}

def logInfo(entry: LogEntry): Unit = {
    if (log.isInfoEnabled) {
      withLogContext(entry.context) {
        log.info(entry.message)
      }
    }
  }

```

The field `messageWithContext` is constructed twice, one from `entry.context` 
and another one from `entry.message`.

We should improve this.






[jira] [Updated] (SPARK-48744) Log entry should be constructed only once

2024-06-27 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang updated SPARK-48744:
---
Summary: Log entry should be constructed only once  (was: Log entry should 
be constructed once)

> Log entry should be constructed only once
> -
>
> Key: SPARK-48744
> URL: https://issues.apache.org/jira/browse/SPARK-48744
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>
> In the current implementation:
> ```
> class LogEntry(messageWithContext: => MessageWithContext) {
>   def message: String = messageWithContext.message
>   def context: java.util.HashMap[String, String] = messageWithContext.context
> }
> def logInfo(entry: LogEntry): Unit = {
>     if (log.isInfoEnabled) {
>       withLogContext(entry.context) {
>         log.info(entry.message)
>       }
>     }
>   }
> ```
> The field `messageWithContext` is constructed twice, one from `entry.context` 
> and another one from `entry.message`.
> We should improve this.






[jira] [Resolved] (SPARK-48629) Migrate the remaining code to structured logging framework

2024-06-24 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48629?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48629.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46986
[https://github.com/apache/spark/pull/46986]

> Migrate the remaining code to structured logging framework
> --
>
> Key: SPARK-48629
> URL: https://issues.apache.org/jira/browse/SPARK-48629
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-48545) Create to_avro and from_avro SQL functions to match PySpark equivalent

2024-06-21 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48545?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48545.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46977
[https://github.com/apache/spark/pull/46977]

> Create to_avro and from_avro SQL functions to match PySpark equivalent
> --
>
> Key: SPARK-48545
> URL: https://issues.apache.org/jira/browse/SPARK-48545
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Daniel
>Assignee: Daniel
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> The PySpark API is here: 
> https://github.com/apache/spark/blob/d5c33c6bfb5757b243fc8e1734daeaa4fe3b9b32/python/pyspark/sql/avro/functions.py#L35
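For context, the DataFrame-level functions that the proposed SQL functions are meant to mirror already exist in Scala as well. A minimal round-trip sketch, assuming the spark-avro module is on the classpath; the schema string and values are illustrative:

{code:scala}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.avro.functions.{from_avro, to_avro}
import org.apache.spark.sql.functions.col

val spark = SparkSession.builder().master("local[*]").getOrCreate()
val df = spark.range(3).toDF("id")

// Encode a column to Avro binary and decode it back; the proposal is to expose
// equivalent to_avro/from_avro functions directly in SQL text.
val avroSchema = """{"type": "long"}"""
val encoded = df.select(to_avro(col("id")).as("avro_id"))
val decoded = encoded.select(from_avro(col("avro_id"), avroSchema).as("id"))
decoded.show()
{code}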






[jira] [Assigned] (SPARK-48545) Create to_avro and from_avro SQL functions to match PySpark equivalent

2024-06-21 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48545?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48545:
--

Assignee: Daniel

> Create to_avro and from_avro SQL functions to match PySpark equivalent
> --
>
> Key: SPARK-48545
> URL: https://issues.apache.org/jira/browse/SPARK-48545
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Daniel
>Assignee: Daniel
>Priority: Major
>  Labels: pull-request-available
>
> The PySpark API is here: 
> https://github.com/apache/spark/blob/d5c33c6bfb5757b243fc8e1734daeaa4fe3b9b32/python/pyspark/sql/avro/functions.py#L35






[jira] [Resolved] (SPARK-48623) Structured Logging Framework Scala Style Migration

2024-06-19 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48623?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48623.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46980
[https://github.com/apache/spark/pull/46980]

> Structured Logging Framework Scala Style Migration
> --
>
> Key: SPARK-48623
> URL: https://issues.apache.org/jira/browse/SPARK-48623
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Amanda Liu
>Assignee: Amanda Liu
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48623) Structured Logging Framework Scala Style Migration

2024-06-19 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48623?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48623:
--

Assignee: Amanda Liu

> Structured Logging Framework Scala Style Migration
> --
>
> Key: SPARK-48623
> URL: https://issues.apache.org/jira/browse/SPARK-48623
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Amanda Liu
>Assignee: Amanda Liu
>Priority: Minor
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-47579) Spark core: Migrate logInfo with variables to structured logging framework

2024-06-17 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47579?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47579.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46951
[https://github.com/apache/spark/pull/46951]

> Spark core: Migrate logInfo with variables to structured logging framework
> --
>
> Key: SPARK-47579
> URL: https://issues.apache.org/jira/browse/SPARK-47579
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Anh Tuan Pham
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48611) Log TID for input split in HadoopRDD and NewHadoopRDD

2024-06-14 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48611?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48611:
--

Assignee: Cheng Pan

> Log TID for input split in HadoopRDD and NewHadoopRDD
> -
>
> Key: SPARK-48611
> URL: https://issues.apache.org/jira/browse/SPARK-48611
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-48611) Log TID for input split in HadoopRDD and NewHadoopRDD

2024-06-14 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48611?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48611.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46966
[https://github.com/apache/spark/pull/46966]

> Log TID for input split in HadoopRDD and NewHadoopRDD
> -
>
> Key: SPARK-48611
> URL: https://issues.apache.org/jira/browse/SPARK-48611
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Cheng Pan
>Assignee: Cheng Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Updated] (SPARK-48626) Change the scope of object LogKeys as private in Spark

2024-06-13 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48626?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang updated SPARK-48626:
---
Summary: Change the scope of object LogKeys as private in Spark  (was: 
Change the scope of object LogKyes as private in Spark)

> Change the scope of object LogKeys as private in Spark
> --
>
> Key: SPARK-48626
> URL: https://issues.apache.org/jira/browse/SPARK-48626
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>







[jira] [Created] (SPARK-48626) Change the scope of object LogKyes as private in Spark

2024-06-13 Thread Gengliang Wang (Jira)
Gengliang Wang created SPARK-48626:
--

 Summary: Change the scope of object LogKyes as private in Spark
 Key: SPARK-48626
 URL: https://issues.apache.org/jira/browse/SPARK-48626
 Project: Spark
  Issue Type: Sub-task
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Gengliang Wang
Assignee: Gengliang Wang









[jira] [Commented] (SPARK-48545) Create from_avro SQL function to match PySpark equivalent

2024-06-05 Thread Gengliang Wang (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-48545?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17852593#comment-17852593
 ] 

Gengliang Wang commented on SPARK-48545:


+1 for having such functions (from_avro and to_avro)

> Create from_avro SQL function to match PySpark equivalent
> -
>
> Key: SPARK-48545
> URL: https://issues.apache.org/jira/browse/SPARK-48545
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Daniel
>Priority: Major
>
> The PySpark API is here: 
> https://github.com/apache/spark/blob/d5c33c6bfb5757b243fc8e1734daeaa4fe3b9b32/python/pyspark/sql/avro/functions.py#L35






[jira] [Assigned] (SPARK-47977) DateTimeUtils.timestampDiff and DateTimeUtils.timestampAdd should not throw INTERNAL_ERROR exception

2024-06-03 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47977?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-47977:
--

Assignee: Vitalii Li

> DateTimeUtils.timestampDiff and DateTimeUtils.timestampAdd should not throw 
> INTERNAL_ERROR exception
> 
>
> Key: SPARK-47977
> URL: https://issues.apache.org/jira/browse/SPARK-47977
> Project: Spark
>  Issue Type: Bug
>  Components: PySpark, SQL
>Affects Versions: 4.0.0
>Reporter: Vitalii Li
>Assignee: Vitalii Li
>Priority: Major
>  Labels: pull-request-available
>
> https://github.com/apache/spark/pull/44263 converted `IllegalStateException`
> to `InternalError` when the unit passed to DateTimeUtils.timestampDiff or
> DateTimeUtils.timestampAdd is not from the sanctioned list.
> Originally, an incorrect unit would have been caught by the parser before the
> expression was constructed. However, PySpark introduced PythonSQLUtils, which
> creates expressions without validation, so `INTERNAL_ERROR` is now the wrong
> error class; the failure should instead be raised as an execution error with
> the correct error class.
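A Scala sketch of the intended shape: validate the unit before the expression is built and raise a user-facing error rather than an internal one. The unit list and the exception type here are illustrative; the actual fix goes through Spark's error-class machinery.

{code:scala}
import java.util.Locale

// Illustrative unit list; see DateTimeUtils for the units Spark actually accepts.
val supportedUnits =
  Set("YEAR", "QUARTER", "MONTH", "WEEK", "DAY", "HOUR", "MINUTE", "SECOND")

def validateUnit(unit: String): String = {
  val normalized = unit.toUpperCase(Locale.ROOT)
  if (!supportedUnits.contains(normalized)) {
    // In Spark this would be raised as an execution error with a proper error
    // class; a plain exception stands in for that here.
    throw new IllegalArgumentException(
      s"Unsupported datetime unit '$unit'; expected one of ${supportedUnits.mkString(", ")}.")
  }
  normalized
}
{code}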






[jira] [Resolved] (SPARK-47977) DateTimeUtils.timestampDiff and DateTimeUtils.timestampAdd should not throw INTERNAL_ERROR exception

2024-06-03 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47977?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47977.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46210
[https://github.com/apache/spark/pull/46210]

> DateTimeUtils.timestampDiff and DateTimeUtils.timestampAdd should not throw 
> INTERNAL_ERROR exception
> 
>
> Key: SPARK-47977
> URL: https://issues.apache.org/jira/browse/SPARK-47977
> Project: Spark
>  Issue Type: Bug
>  Components: PySpark, SQL
>Affects Versions: 4.0.0
>Reporter: Vitalii Li
>Assignee: Vitalii Li
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> https://github.com/apache/spark/pull/44263 converted `IllegalStateException`
> to `InternalError` when the unit passed to DateTimeUtils.timestampDiff or
> DateTimeUtils.timestampAdd is not from the sanctioned list.
> Originally, an incorrect unit would have been caught by the parser before the
> expression was constructed. However, PySpark introduced PythonSQLUtils, which
> creates expressions without validation, so `INTERNAL_ERROR` is now the wrong
> error class; the failure should instead be raised as an execution error with
> the correct error class.






[jira] [Assigned] (SPARK-48488) Restore the original logic of methods `log[info|warning|error]` in `SparkSubmit`

2024-06-02 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48488?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48488:
--

Assignee: BingKun Pan

> Restore the original logic of methods `log[info|warning|error]` in 
> `SparkSubmit`
> 
>
> Key: SPARK-48488
> URL: https://issues.apache.org/jira/browse/SPARK-48488
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Critical
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-48488) Restore the original logic of methods `log[info|warning|error]` in `SparkSubmit`

2024-06-02 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48488?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48488.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46822
[https://github.com/apache/spark/pull/46822]

> Restore the original logic of methods `log[info|warning|error]` in 
> `SparkSubmit`
> 
>
> Key: SPARK-48488
> URL: https://issues.apache.org/jira/browse/SPARK-48488
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48490) Unescapes any literals for message of MessageWithContext

2024-05-31 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48490?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48490:
--

Assignee: BingKun Pan

> Unescapes any literals for message of MessageWithContext
> 
>
> Key: SPARK-48490
> URL: https://issues.apache.org/jira/browse/SPARK-48490
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Critical
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-48490) Unescapes any literals for message of MessageWithContext

2024-05-31 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48490?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48490.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46824
[https://github.com/apache/spark/pull/46824]

> Unescapes any literals for message of MessageWithContext
> 
>
> Key: SPARK-48490
> URL: https://issues.apache.org/jira/browse/SPARK-48490
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48320) Add external third-party ecosystem access guide to the doc

2024-05-25 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48320?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48320:
--

Assignee: BingKun Pan

> Add external third-party ecosystem access guide to the doc
> --
>
> Key: SPARK-48320
> URL: https://issues.apache.org/jira/browse/SPARK-48320
> Project: Spark
>  Issue Type: Sub-task
>  Components: Documentation, Spark Core
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-48320) Add external third-party ecosystem access guide to the doc

2024-05-25 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48320?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48320.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46634
[https://github.com/apache/spark/pull/46634]

> Add external third-party ecosystem access guide to the doc
> --
>
> Key: SPARK-48320
> URL: https://issues.apache.org/jira/browse/SPARK-48320
> Project: Spark
>  Issue Type: Sub-task
>  Components: Documentation, Spark Core
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48294) Make nestedTypeMissingElementTypeError case insensitive

2024-05-17 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48294?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48294:
--

Assignee: Michael Zhang

> Make nestedTypeMissingElementTypeError case insensitive
> ---
>
> Key: SPARK-48294
> URL: https://issues.apache.org/jira/browse/SPARK-48294
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 3.5.0, 4.0.0, 3.5.1, 3.5.2
>Reporter: Michael Zhang
>Assignee: Michael Zhang
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> When a complex data type is declared incorrectly using nested types (ARRAY, MAP,
> or STRUCT), the query fails with a MatchError rather than
> `INCOMPLETE_TYPE_DEFINITION`. This is because the match is case sensitive.
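The gist of the fix can be illustrated by normalizing the keyword before matching, so that `array`, `Array`, and `ARRAY` all reach the intended branch; the helper and messages below are illustrative, not Spark's actual code:

{code:scala}
import java.util.Locale

def incompleteTypeHint(nestedType: String): String =
  nestedType.toUpperCase(Locale.ROOT) match {
    // Without the normalization above, a lowercase keyword would fall through
    // the match and surface as a MatchError instead of INCOMPLETE_TYPE_DEFINITION.
    case "ARRAY"  => "ARRAY requires an element type, e.g. ARRAY<INT>"
    case "MAP"    => "MAP requires key and value types, e.g. MAP<STRING, INT>"
    case "STRUCT" => "STRUCT requires field definitions, e.g. STRUCT<a: INT>"
    case other    => s"Unexpected nested type: $other"
  }
{code}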






[jira] [Resolved] (SPARK-48294) Make nestedTypeMissingElementTypeError case insensitive

2024-05-17 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48294?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48294.

Fix Version/s: 3.5.2
   (was: 4.0.0)
   Resolution: Fixed

Issue resolved by pull request 46643
[https://github.com/apache/spark/pull/46643]

> Make nestedTypeMissingElementTypeError case insensitive
> ---
>
> Key: SPARK-48294
> URL: https://issues.apache.org/jira/browse/SPARK-48294
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 3.5.0, 4.0.0, 3.5.1, 3.5.2
>Reporter: Michael Zhang
>Assignee: Michael Zhang
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.5.2
>
>
> When a complex data type is declared incorrectly using nested types (ARRAY, MAP,
> or STRUCT), the query fails with a MatchError rather than
> `INCOMPLETE_TYPE_DEFINITION`. This is because the match is case sensitive.






[jira] [Resolved] (SPARK-48303) Reorganize `LogKey`

2024-05-17 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48303?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48303.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46612
[https://github.com/apache/spark/pull/46612]

> Reorganize `LogKey`
> ---
>
> Key: SPARK-48303
> URL: https://issues.apache.org/jira/browse/SPARK-48303
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48303) Reorganize `LogKey`

2024-05-17 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48303?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48303:
--

Assignee: BingKun Pan

> Reorganize `LogKey`
> ---
>
> Key: SPARK-48303
> URL: https://issues.apache.org/jira/browse/SPARK-48303
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
>







[jira] [Assigned] (SPARK-48214) Ban import `org.slf4j.Logger` & `org.slf4j.LoggerFactory`

2024-05-15 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48214?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48214:
--

Assignee: BingKun Pan

> Ban import `org.slf4j.Logger` & `org.slf4j.LoggerFactory`
> -
>
> Key: SPARK-48214
> URL: https://issues.apache.org/jira/browse/SPARK-48214
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Critical
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-48214) Ban import `org.slf4j.Logger` & `org.slf4j.LoggerFactory`

2024-05-15 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48214?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48214.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46502
[https://github.com/apache/spark/pull/46502]

> Ban import `org.slf4j.Logger` & `org.slf4j.LoggerFactory`
> -
>
> Key: SPARK-48214
> URL: https://issues.apache.org/jira/browse/SPARK-48214
> Project: Spark
>  Issue Type: Sub-task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-48291) Rename Java Logger as SparkLogger

2024-05-15 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48291?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48291.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46600
[https://github.com/apache/spark/pull/46600]

> Rename Java Logger as SparkLogger 
> --
>
> Key: SPARK-48291
> URL: https://issues.apache.org/jira/browse/SPARK-48291
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Two new classes, org.apache.spark.internal.Logger and
> org.apache.spark.internal.LoggerFactory, were introduced in
> [https://github.com/apache/spark/pull/46301].
> Given that {{Logger}} is a widely recognized interface in Log4j, having a class
> with the same name may lead to confusion. To avoid this and clarify its purpose
> within the Spark framework, I propose renaming
> {{org.apache.spark.internal.Logger}} to
> {{org.apache.spark.internal.SparkLogger}}. Similarly, to maintain
> consistency, {{org.apache.spark.internal.LoggerFactory}} should be renamed to
> {{org.apache.spark.internal.SparkLoggerFactory}}.






[jira] [Updated] (SPARK-48291) Rename Java Logger as SparkLogger

2024-05-15 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48291?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang updated SPARK-48291:
---
Summary: Rename Java Logger as SparkLogger   (was: Refactor Java Logger as 
SparkLogger )

> Rename Java Logger as SparkLogger 
> --
>
> Key: SPARK-48291
> URL: https://issues.apache.org/jira/browse/SPARK-48291
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>
> Two new classes, org.apache.spark.internal.Logger and
> org.apache.spark.internal.LoggerFactory, were introduced in
> [https://github.com/apache/spark/pull/46301].
> Given that {{Logger}} is a widely recognized interface in Log4j, having a class
> with the same name may lead to confusion. To avoid this and clarify its purpose
> within the Spark framework, I propose renaming
> {{org.apache.spark.internal.Logger}} to
> {{org.apache.spark.internal.SparkLogger}}. Similarly, to maintain
> consistency, {{org.apache.spark.internal.LoggerFactory}} should be renamed to
> {{org.apache.spark.internal.SparkLoggerFactory}}.






[jira] [Created] (SPARK-48291) Refactor Java Logger as SparkLogger

2024-05-15 Thread Gengliang Wang (Jira)
Gengliang Wang created SPARK-48291:
--

 Summary: Refactor Java Logger as SparkLogger 
 Key: SPARK-48291
 URL: https://issues.apache.org/jira/browse/SPARK-48291
 Project: Spark
  Issue Type: Sub-task
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Gengliang Wang
Assignee: Gengliang Wang


Two new classes, org.apache.spark.internal.Logger and
org.apache.spark.internal.LoggerFactory, were introduced in
[https://github.com/apache/spark/pull/46301].

Given that {{Logger}} is a widely recognized interface in Log4j, having a class with
the same name may lead to confusion. To avoid this and clarify its purpose within
the Spark framework, I propose renaming
{{org.apache.spark.internal.Logger}} to
{{org.apache.spark.internal.SparkLogger}}. Similarly, to maintain
consistency, {{org.apache.spark.internal.LoggerFactory}} should be renamed to
{{org.apache.spark.internal.SparkLoggerFactory}}.






[jira] [Resolved] (SPARK-47599) MLLib: Migrate logWarn with variables to structured logging framework

2024-05-14 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47599?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47599.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46527
[https://github.com/apache/spark/pull/46527]

> MLLib: Migrate logWarn with variables to structured logging framework
> -
>
> Key: SPARK-47599
> URL: https://issues.apache.org/jira/browse/SPARK-47599
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: BingKun Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48209) Common (java side): Migrate `error/warn/info` with variables to structured logging framework

2024-05-13 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48209?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48209:
--

Assignee: BingKun Pan

> Common (java side): Migrate `error/warn/info` with variables to structured 
> logging framework
> 
>
> Key: SPARK-48209
> URL: https://issues.apache.org/jira/browse/SPARK-48209
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Critical
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-48209) Common (java side): Migrate `error/warn/info` with variables to structured logging framework

2024-05-13 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48209?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48209.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46493
[https://github.com/apache/spark/pull/46493]

> Common (java side): Migrate `error/warn/info` with variables to structured 
> logging framework
> 
>
> Key: SPARK-48209
> URL: https://issues.apache.org/jira/browse/SPARK-48209
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48260) disable output committer coordination in one test of ParquetIOSuite

2024-05-13 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48260?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48260:
--

Assignee: Gengliang Wang

> disable output committer coordination in one test of ParquetIOSuite
> ---
>
> Key: SPARK-48260
> URL: https://issues.apache.org/jira/browse/SPARK-48260
> Project: Spark
>  Issue Type: Test
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Wenchen Fan
>Assignee: Gengliang Wang
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-48260) disable output committer coordination in one test of ParquetIOSuite

2024-05-13 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48260?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48260.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46562
[https://github.com/apache/spark/pull/46562]

> disable output committer coordination in one test of ParquetIOSuite
> ---
>
> Key: SPARK-48260
> URL: https://issues.apache.org/jira/browse/SPARK-48260
> Project: Spark
>  Issue Type: Test
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Wenchen Fan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Commented] (SPARK-47599) MLLib: Migrate logWarn with variables to structured logging framework

2024-05-08 Thread Gengliang Wang (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-47599?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17844869#comment-17844869
 ] 

Gengliang Wang commented on SPARK-47599:


I checked with [~itholic] and he is not available for this one.

[~panbingkun] are you interested in this one?

> MLLib: Migrate logWarn with variables to structured logging framework
> -
>
> Key: SPARK-47599
> URL: https://issues.apache.org/jira/browse/SPARK-47599
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Priority: Major
>







[jira] [Resolved] (SPARK-48182) SQL (java side): Migrate `error/warn/info` with variables to structured logging framework

2024-05-08 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48182?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48182.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46450
[https://github.com/apache/spark/pull/46450]

> SQL (java side): Migrate `error/warn/info` with variables to structured 
> logging framework
> -
>
> Key: SPARK-48182
> URL: https://issues.apache.org/jira/browse/SPARK-48182
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48182) SQL (java side): Migrate `error/warn/info` with variables to structured logging framework

2024-05-08 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48182?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48182:
--

Assignee: BingKun Pan

> SQL (java side): Migrate `error/warn/info` with variables to structured 
> logging framework
> -
>
> Key: SPARK-48182
> URL: https://issues.apache.org/jira/browse/SPARK-48182
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Critical
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-48126) Make spark.log.structuredLogging.enabled effective

2024-05-07 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48126?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48126.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46452
[https://github.com/apache/spark/pull/46452]

> Make spark.log.structuredLogging.enabled effective
> --
>
> Key: SPARK-48126
> URL: https://issues.apache.org/jira/browse/SPARK-48126
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Currently, the Spark conf spark.log.structuredLogging.enabled is not taking
> effect. We need to fix it.
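Once the flag takes effect, it is toggled like any other Spark configuration. A minimal sketch; whether a programmatic setting is picked up depends on when logging is initialized, so spark-defaults.conf or `--conf` on spark-submit may be the safer route:

{code:scala}
import org.apache.spark.sql.SparkSession

// Illustrative: fall back to plain-text logs instead of structured (JSON) logs.
val spark = SparkSession.builder()
  .master("local[*]")
  .config("spark.log.structuredLogging.enabled", "false")
  .getOrCreate()
{code}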






[jira] [Updated] (SPARK-48126) Make spark.log.structuredLogging.enabled effective

2024-05-07 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48126?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang updated SPARK-48126:
---
Summary: Make spark.log.structuredLogging.enabled effective  (was: Make 
spark.log.structuredLogging.enabled effecitve)

> Make spark.log.structuredLogging.enabled effective
> --
>
> Key: SPARK-48126
> URL: https://issues.apache.org/jira/browse/SPARK-48126
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>
> Currently, the Spark conf spark.log.structuredLogging.enabled is not taking
> effect. We need to fix it.






[jira] [Resolved] (SPARK-48134) Spark core (java side): Migrate `error/warn/info` with variables to structured logging framework

2024-05-07 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48134?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48134.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46390
[https://github.com/apache/spark/pull/46390]

> Spark core (java side): Migrate `error/warn/info` with variables to 
> structured logging framework
> 
>
> Key: SPARK-48134
> URL: https://issues.apache.org/jira/browse/SPARK-48134
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-48134) Spark core (java side): Migrate `error/warn/info` with variables to structured logging framework

2024-05-07 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48134?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48134:
--

Assignee: BingKun Pan

> Spark core (java side): Migrate `error/warn/info` with variables to 
> structured logging framework
> 
>
> Key: SPARK-48134
> URL: https://issues.apache.org/jira/browse/SPARK-48134
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Critical
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-48145) Remove logDebug and logTrace with MDC in java structured logging framework

2024-05-06 Thread Gengliang Wang (Jira)
Gengliang Wang created SPARK-48145:
--

 Summary: Remove logDebug and logTrace with MDC in java structured 
logging framework
 Key: SPARK-48145
 URL: https://issues.apache.org/jira/browse/SPARK-48145
 Project: Spark
  Issue Type: Sub-task
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Gengliang Wang
Assignee: Gengliang Wang






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-48131) Unify MDC key `mdc.taskName` and `task_name`

2024-05-04 Thread Gengliang Wang (Jira)
Gengliang Wang created SPARK-48131:
--

 Summary: Unify MDC key `mdc.taskName` and `task_name`
 Key: SPARK-48131
 URL: https://issues.apache.org/jira/browse/SPARK-48131
 Project: Spark
  Issue Type: Sub-task
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Gengliang Wang
Assignee: Gengliang Wang


Rename the MDC key `mdc.taskName` to `task_name`, so that it is consistent with 
the other MDC keys used in the structured logging framework.
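As an illustration of what the unified key buys, a hedged sketch of querying structured logs by `task_name`; the log path and the `context.task_name` field layout are assumptions, not taken from the change:

{code:java}
// Hedged sketch: filter structured JSON logs on the unified `task_name` MDC key.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

val spark = SparkSession.builder().appName("mdc-key-demo").getOrCreate()
val logs = spark.read.json("/tmp/spark-structured-logs") // hypothetical path

logs.filter(col("context.task_name").isNotNull)
  .select(col("ts"), col("level"), col("context.task_name"), col("msg"))
  .show(truncate = false)
{code}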



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-48129) Provide a constant table schema in PySpark for querying structured logs

2024-05-04 Thread Gengliang Wang (Jira)
Gengliang Wang created SPARK-48129:
--

 Summary: Provide a constant table schema in PySpark for querying 
structured logs
 Key: SPARK-48129
 URL: https://issues.apache.org/jira/browse/SPARK-48129
 Project: Spark
  Issue Type: Sub-task
  Components: PySpark
Affects Versions: 4.0.0
Reporter: Gengliang Wang
Assignee: Gengliang Wang






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-48123) Provide a constant table schema for querying structured logs

2024-05-04 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48123?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48123.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46375
[https://github.com/apache/spark/pull/46375]

> Provide a constant table schema for querying structured logs
> 
>
> Key: SPARK-48123
> URL: https://issues.apache.org/jira/browse/SPARK-48123
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Provide a table schema LOG_SCHEMA, so that users can load structured logs 
> with the following:
> ```
> spark.read.schema(LOG_SCHEMA).json(logPath)
> ```
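A slightly expanded sketch of that usage; the import location of LOG_SCHEMA is assumed here for illustration and should be checked against the merged PR:

{code:java}
// Hedged sketch: read structured logs with the provided constant schema.
import org.apache.spark.sql.SparkSession
import org.apache.spark.util.LogUtils.LOG_SCHEMA // import path assumed

val spark = SparkSession.builder().appName("log-schema-demo").getOrCreate()
val logPath = "/tmp/spark-structured-logs" // hypothetical path

val logs = spark.read.schema(LOG_SCHEMA).json(logPath)
logs.filter("level = 'ERROR'").select("ts", "logger", "msg").show(truncate = false)
{code}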



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47578) Spark core: Migrate logWarn with variables to structured logging framework

2024-05-04 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47578?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47578.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46309
[https://github.com/apache/spark/pull/46309]

> Spark core: Migrate logWarn with variables to structured logging framework
> --
>
> Key: SPARK-47578
> URL: https://issues.apache.org/jira/browse/SPARK-47578
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Daniel
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-48124) Disable structured logging for Interpreter by default

2024-05-03 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48124?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang updated SPARK-48124:
---
Description: 
Since the interpreters (spark-shell/spark-sql/pyspark) produce plain-text output, 
it makes more sense to disable structured logging for them by default.

 

spark-shell output when with structured logging enabled:

```

Setting default log level to "WARN".

To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
setLogLevel(newLevel).

Welcome to

                    __

     / __/__  ___ _/ /__

    _\ \/ _ \/ _ `/ __/  '_/

   /___/ .__/\_,_/_/ /_/\_\   version 4.0.0-SNAPSHOT

      /_/

         

Using Scala version 2.13.13 (OpenJDK 64-Bit Server VM, Java 17.0.9)

Type in expressions to have them evaluated.

Type :help for more information.

{"ts":"2024-05-04T01:11:03.797Z","level":"WARN","msg":"Unable to load 
native-hadoop library for your platform... using builtin-java classes where 
applicable","logger":"NativeCodeLoader"}

{"ts":"2024-05-04T01:11:04.104Z","level":"WARN","msg":"Service 'SparkUI' could 
not bind on port 4040. Attempting port 4041.","logger":"Utils"}

Spark context Web UI available at http://10.10.114.155:4041

Spark context available as 'sc' (master = local[*], app id = 
local-1714785064155).

Spark session available as 'spark'.

```

 

spark-shell output when without structured logging enabled:

```

Setting default log level to "WARN".

To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
setLogLevel(newLevel).

Welcome to

                    __

     / __/__  ___ _/ /__

    _\ \/ _ \/ _ `/ __/  '_/

   /___/ .__/\_,_/_/ /_/\_\   version 4.0.0-SNAPSHOT

      /_/

         

Using Scala version 2.13.13 (OpenJDK 64-Bit Server VM, Java 17.0.9)

Type in expressions to have them evaluated.

Type :help for more information.

24/05/03 18:11:35 WARN NativeCodeLoader: Unable to load native-hadoop library 
for your platform... using builtin-java classes where applicable

24/05/03 18:11:35 WARN Utils: Service 'SparkUI' could not bind on port 4040. 
Attempting port 4041.

Spark context Web UI available at http://10.10.114.155:4041

Spark context available as 'sc' (master = local[*], app id = 
local-1714785095892).

Spark session available as 'spark'.

```

  was:
Since the interpreters (spark-shell/spark-sql/pyspark) produce plain-text output, 
it makes more sense to disable structured logging for them by default.

 


> Disable structured logging for Interpreter by default
> -
>
> Key: SPARK-48124
> URL: https://issues.apache.org/jira/browse/SPARK-48124
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>
> Since the interpreters (spark-shell/spark-sql/pyspark) produce plain-text 
> output, it makes more sense to disable structured logging for them by default.
>  
> spark-shell output when with structured logging enabled:
> ```
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
> setLogLevel(newLevel).
> Welcome to
>                     __
>      / __/__  ___ _/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 4.0.0-SNAPSHOT
>       /_/
>          
> Using Scala version 2.13.13 (OpenJDK 64-Bit Server VM, Java 17.0.9)
> Type in expressions to have them evaluated.
> Type :help for more information.
> {"ts":"2024-05-04T01:11:03.797Z","level":"WARN","msg":"Unable to load 
> native-hadoop library for your platform... using builtin-java classes where 
> applicable","logger":"NativeCodeLoader"}
> {"ts":"2024-05-04T01:11:04.104Z","level":"WARN","msg":"Service 'SparkUI' 
> could not bind on port 4040. Attempting port 4041.","logger":"Utils"}
> Spark context Web UI available at http://10.10.114.155:4041
> Spark context available as 'sc' (master = local[*], app id = 
> local-1714785064155).
> Spark session available as 'spark'.
> ```
>  
> spark-shell output when without structured logging enabled:
> ```
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
> setLogLevel(newLevel).
> Welcome to
>                     __
>      / __/__  ___ _/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 4.0.0-SNAPSHOT
>       /_/
>          
> Using Scala version 2.13.13 (OpenJDK 64-Bit Server VM, Java 17.0.9)
> Type in expressions to have them evaluated.
> Type :help for more information.
> 24/05/03 18:11:35 WARN NativeCodeLoader: Unable to load native-hadoop library 
> for your platform... using builtin-java classes where applicable
> 24/05/03 18:11:35 WARN Utils: 

[jira] [Updated] (SPARK-48124) Disable structured logging for Interpreter by default

2024-05-03 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48124?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang updated SPARK-48124:
---
Description: 
Since the interpreters (spark-shell/spark-sql/pyspark) produce plain-text output, 
it makes more sense to disable structured logging for them by default.

 

> Disable structured logging for Interpreter by default
> -
>
> Key: SPARK-48124
> URL: https://issues.apache.org/jira/browse/SPARK-48124
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>
> Since the interpreters (spark-shell/spark-sql/pyspark) produce plain-text 
> output, it makes more sense to disable structured logging for them by default.
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-48126) Make spark.log.structuredLogging.enabled effecitve

2024-05-03 Thread Gengliang Wang (Jira)
Gengliang Wang created SPARK-48126:
--

 Summary: Make spark.log.structuredLogging.enabled effecitve
 Key: SPARK-48126
 URL: https://issues.apache.org/jira/browse/SPARK-48126
 Project: Spark
  Issue Type: Sub-task
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Gengliang Wang
Assignee: Gengliang Wang


Currently, the Spark conf spark.log.structuredLogging.enabled is not taking 
effect. We need to fix it.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-48124) Disable structured logging for Interpreter by default

2024-05-03 Thread Gengliang Wang (Jira)
Gengliang Wang created SPARK-48124:
--

 Summary: Disable structured logging for Interpreter by default
 Key: SPARK-48124
 URL: https://issues.apache.org/jira/browse/SPARK-48124
 Project: Spark
  Issue Type: Sub-task
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Gengliang Wang
Assignee: Gengliang Wang






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-48059) Implement the structured log framework on the java side

2024-05-03 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48059?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48059:
--

Assignee: BingKun Pan

> Implement the structured log framework on the java side
> ---
>
> Key: SPARK-48059
> URL: https://issues.apache.org/jira/browse/SPARK-48059
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Critical
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-48059) Implement the structured log framework on the java side

2024-05-03 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48059?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48059.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46301
[https://github.com/apache/spark/pull/46301]

> Implement the structured log framework on the java side
> ---
>
> Key: SPARK-48059
> URL: https://issues.apache.org/jira/browse/SPARK-48059
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-48123) Provide a constant table schema for querying structured logs

2024-05-03 Thread Gengliang Wang (Jira)
Gengliang Wang created SPARK-48123:
--

 Summary: Provide a constant table schema for querying structured 
logs
 Key: SPARK-48123
 URL: https://issues.apache.org/jira/browse/SPARK-48123
 Project: Spark
  Issue Type: Sub-task
  Components: SQL
Affects Versions: 4.0.0
Reporter: Gengliang Wang
Assignee: Gengliang Wang


Provide a table schema LOG_SCHEMA, so that users can load structured logs 
with the following:

```

spark.read.schema(LOG_SCHEMA).json(logPath)

```



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-47671) Enable structured logging in log4j2.properties.template and update `configuration.md`

2024-05-02 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47671?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang updated SPARK-47671:
---
Description: 
# rename the current log4j2.properties.template as 
log4j2.properties.pattern-layout-template
 # Enable structured logging in log4j2.properties.template
 # Update `configuration.md` on how to configure logging

  was:
# rename the current log4j2.properties.template as 
log4j2-pattern-layout.properties.template 
 # Enable structured logging in log4j2.properties.template
 # Update `configuration.md` on how to configure logging


> Enable structured logging in log4j2.properties.template and update 
> `configuration.md`
> -
>
> Key: SPARK-47671
> URL: https://issues.apache.org/jira/browse/SPARK-47671
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Priority: Major
>
> # rename the current log4j2.properties.template as 
> log4j2.properties.pattern-layout-template
>  # Enable structured logging in log4j2.properties.template
>  # Update `configuration.md` on how to configure logging



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-48067) Fix Variant default columns for more complex default variants

2024-05-02 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48067?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48067.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46312
[https://github.com/apache/spark/pull/46312]

> Fix Variant default columns for more complex default variants
> -
>
> Key: SPARK-48067
> URL: https://issues.apache.org/jira/browse/SPARK-48067
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Richard Chen
>Assignee: Richard Chen
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Default columns are stored in the StructField metadata (string -> string) map, 
> which means the literal default values are stored as strings.
> However, the string representation of a variant is its JSON form, so we need to 
> wrap the stored string with `parse_json` to correctly apply the default values.
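As an illustration of the wrapping, a hedged sketch assuming an active SparkSession `spark`; the table name, data source, and default literal are made up for the example:

{code:java}
// Hedged sketch: a VARIANT default must round-trip through parse_json because the
// value stored in the column metadata is the JSON string form of the variant.
spark.sql("""
  CREATE TABLE variant_default_demo (
    id INT,
    v  VARIANT DEFAULT parse_json('{"a": 1}')
  ) USING parquet
""")
spark.sql("INSERT INTO variant_default_demo (id) VALUES (1)") // v falls back to the default
spark.sql("SELECT id, v FROM variant_default_demo").show(truncate = false)
{code}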



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-48067) Fix Variant default columns for more complex default variants

2024-05-02 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48067?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-48067:
--

Assignee: Richard Chen

> Fix Variant default columns for more complex default variants
> -
>
> Key: SPARK-48067
> URL: https://issues.apache.org/jira/browse/SPARK-48067
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Richard Chen
>Assignee: Richard Chen
>Priority: Major
>  Labels: pull-request-available
>
> Default columns are stored in the StructField metadata (string -> string) map, 
> which means the literal default values are stored as strings.
> However, the string representation of a variant is its JSON form, so we need to 
> wrap the stored string with `parse_json` to correctly apply the default values.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47585) SQL core: Migrate logInfo with variables to structured logging framework

2024-04-29 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47585?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47585.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46264
[https://github.com/apache/spark/pull/46264]

> SQL core: Migrate logInfo with variables to structured logging framework
> 
>
> Key: SPARK-47585
> URL: https://issues.apache.org/jira/browse/SPARK-47585
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: BingKun Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-48016) Fix a bug in try_divide function when with decimals

2024-04-29 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48016?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang updated SPARK-48016:
---
Fix Version/s: 4.0.0
   3.5.2

> Fix a bug in try_divide function when with decimals
> ---
>
> Key: SPARK-48016
> URL: https://issues.apache.org/jira/browse/SPARK-48016
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0, 3.5.2
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.2, 3.4.4
>
>
> Binary Arithmetic operators should include the evalMode during makeCopy. 
> Otherwise, the following query will throw a DIVIDE_BY_ZERO error instead of 
> returning null:
>  
> {code:java}
> SELECT try_divide(1, decimal(0)); {code}
> This is caused by the rule DecimalPrecision:
> {code:java}
> case b @ BinaryOperator(left, right) if left.dataType != right.dataType =>
>   (left, right) match {
>  ...
> case (l: Literal, r) if r.dataType.isInstanceOf[DecimalType] &&
> l.dataType.isInstanceOf[IntegralType] &&
> literalPickMinimumPrecision =>
>   b.makeCopy(Array(Cast(l, DataTypeUtils.fromLiteral(l)), r)) {code}
>  
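For reference, a hedged sketch of the observable behavior, assuming an active SparkSession `spark`; the expected output comment reflects the fix description, not a captured run:

{code:java}
// Hedged sketch: with the fix, try_divide keeps evalMode = TRY when
// DecimalPrecision re-copies the operator, so division by a zero decimal
// yields NULL instead of raising DIVIDE_BY_ZERO.
spark.sql("SELECT try_divide(1, decimal(0)) AS result").show()
// Expected after the fix:
// +------+
// |result|
// +------+
// |  NULL|
// +------+
{code}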



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-48016) Fix a bug in try_divide function when with decimals

2024-04-29 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48016?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-48016.

Fix Version/s: 3.4.4
   Resolution: Fixed

Issue resolved by pull request 46289
[https://github.com/apache/spark/pull/46289]

> Fix a bug in try_divide function when with decimals
> ---
>
> Key: SPARK-48016
> URL: https://issues.apache.org/jira/browse/SPARK-48016
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0, 3.5.2
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.4.4
>
>
> Binary Arithmetic operators should include the evalMode during makeCopy. 
> Otherwise, the following query will throw a DIVIDE_BY_ZERO error instead of 
> returning null:
>  
> {code:java}
> SELECT try_divide(1, decimal(0)); {code}
> This is caused by the rule DecimalPrecision:
> {code:java}
> case b @ BinaryOperator(left, right) if left.dataType != right.dataType =>
>   (left, right) match {
>  ...
> case (l: Literal, r) if r.dataType.isInstanceOf[DecimalType] &&
> l.dataType.isInstanceOf[IntegralType] &&
> literalPickMinimumPrecision =>
>   b.makeCopy(Array(Cast(l, DataTypeUtils.fromLiteral(l)), r)) {code}
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-48016) Fix a bug in try_divide function when with decimals

2024-04-29 Thread Gengliang Wang (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-48016?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17842193#comment-17842193
 ] 

Gengliang Wang commented on SPARK-48016:


[~dongjoon] sure, I just moved it.

> Fix a bug in try_divide function when with decimals
> ---
>
> Key: SPARK-48016
> URL: https://issues.apache.org/jira/browse/SPARK-48016
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0, 3.5.2
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>  Labels: pull-request-available
>
> Binary Arithmetic operators should include the evalMode during makeCopy. 
> Otherwise, the following query will throw a DIVIDE_BY_ZERO error instead of 
> returning null:
>  
> {code:java}
> SELECT try_divide(1, decimal(0)); {code}
> This is caused by the rule DecimalPrecision:
> {code:java}
> case b @ BinaryOperator(left, right) if left.dataType != right.dataType =>
>   (left, right) match {
>  ...
> case (l: Literal, r) if r.dataType.isInstanceOf[DecimalType] &&
> l.dataType.isInstanceOf[IntegralType] &&
> literalPickMinimumPrecision =>
>   b.makeCopy(Array(Cast(l, DataTypeUtils.fromLiteral(l)), r)) {code}
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-48016) Fix a bug in try_divide function when with decimals

2024-04-29 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48016?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang updated SPARK-48016:
---
Parent Issue: SPARK-44111  (was: SPARK-35161)

> Fix a bug in try_divide function when with decimals
> ---
>
> Key: SPARK-48016
> URL: https://issues.apache.org/jira/browse/SPARK-48016
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0, 3.5.2
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>  Labels: pull-request-available
>
> Binary Arithmetic operators should include the evalMode during makeCopy. 
> Otherwise, the following query will throw a DIVIDE_BY_ZERO error instead of 
> returning null:
>  
> {code:java}
> SELECT try_divide(1, decimal(0)); {code}
> This is caused by the rule DecimalPrecision:
> {code:java}
> case b @ BinaryOperator(left, right) if left.dataType != right.dataType =>
>   (left, right) match {
>  ...
> case (l: Literal, r) if r.dataType.isInstanceOf[DecimalType] &&
> l.dataType.isInstanceOf[IntegralType] &&
> literalPickMinimumPrecision =>
>   b.makeCopy(Array(Cast(l, DataTypeUtils.fromLiteral(l)), r)) {code}
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-48035) try_add/try_multiply should not be semantic equal to add/multiply

2024-04-28 Thread Gengliang Wang (Jira)
Gengliang Wang created SPARK-48035:
--

 Summary: try_add/try_multiply should not be semantic equal to 
add/multiply
 Key: SPARK-48035
 URL: https://issues.apache.org/jira/browse/SPARK-48035
 Project: Spark
  Issue Type: Bug
  Components: SQL
Affects Versions: 4.0.0
Reporter: Gengliang Wang


In the current implementation, the following code will return true
{code:java}
val l1 = Literal(1)
val l2 = Literal(2)
val l3 = Literal(3)
val expr1 = Add(Add(l1, l2), l3)
val expr2 = Add(Add(l2, l1, EvalMode.TRY), l3)
expr1.semanticEquals(expr2) {code}
The same applies to Multiply.

When creating MultiCommutativeOp for Add/Multiply, we should ensure that all the 
evalMode values are consistent.
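The desired outcome can be stated directly; this is a sketch of the expected behavior after the fix, not of the implementation:

{code:java}
// Hedged sketch: expressions that differ only in evalMode should no longer be
// reported as semantically equal once MultiCommutativeOp respects evalMode.
import org.apache.spark.sql.catalyst.expressions.{Add, EvalMode, Literal}

val l1 = Literal(1)
val l2 = Literal(2)
val l3 = Literal(3)
val expr1 = Add(Add(l1, l2), l3)               // default eval mode throughout
val expr2 = Add(Add(l2, l1, EvalMode.TRY), l3) // inner Add uses TRY mode
assert(!expr1.semanticEquals(expr2))           // expected to hold after the fix
{code}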



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-48016) Binary Arithmetic operators should include the evalMode when makeCopy

2024-04-26 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-48016?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang updated SPARK-48016:
---
Summary: Binary Arithmetic operators should include the evalMode when 
makeCopy  (was: Binary Arithmetic operators should include the evalMode during 
makeCopy)

> Binary Arithmetic operators should include the evalMode when makeCopy
> -
>
> Key: SPARK-48016
> URL: https://issues.apache.org/jira/browse/SPARK-48016
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0, 3.5.2
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>
> Binary Arithmetic operators should include the evalMode during makeCopy. 
> Otherwise, the following query will throw a DIVIDE_BY_ZERO error instead of 
> returning null:
>  
> {code:java}
> SELECT try_divide(1, decimal(0)); {code}
> This is caused by the rule DecimalPrecision:
> {code:java}
> case b @ BinaryOperator(left, right) if left.dataType != right.dataType =>
>   (left, right) match {
>  ...
> case (l: Literal, r) if r.dataType.isInstanceOf[DecimalType] &&
> l.dataType.isInstanceOf[IntegralType] &&
> literalPickMinimumPrecision =>
>   b.makeCopy(Array(Cast(l, DataTypeUtils.fromLiteral(l)), r)) {code}
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-48016) Binary Arithmetic operators should include the evalMode during makeCopy

2024-04-26 Thread Gengliang Wang (Jira)
Gengliang Wang created SPARK-48016:
--

 Summary: Binary Arithmetic operators should include the evalMode 
during makeCopy
 Key: SPARK-48016
 URL: https://issues.apache.org/jira/browse/SPARK-48016
 Project: Spark
  Issue Type: Sub-task
  Components: SQL
Affects Versions: 4.0.0, 3.5.2
Reporter: Gengliang Wang
Assignee: Gengliang Wang


Binary Arithmetic operators should include the evalMode during makeCopy. 
Otherwise, the following query will throw a DIVIDE_BY_ZERO error instead of 
returning null:

 
{code:java}
SELECT try_divide(1, decimal(0)); {code}
This is caused by the rule DecimalPrecision:
{code:java}
case b @ BinaryOperator(left, right) if left.dataType != right.dataType =>
  (left, right) match {
 ...
case (l: Literal, r) if r.dataType.isInstanceOf[DecimalType] &&
l.dataType.isInstanceOf[IntegralType] &&
literalPickMinimumPrecision =>
  b.makeCopy(Array(Cast(l, DataTypeUtils.fromLiteral(l)), r)) {code}
 



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47696) try_to_timestamp should handle SparkUpgradeException

2024-04-26 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47696?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47696.

Resolution: Won't Fix

> try_to_timestamp should handle SparkUpgradeException
> 
>
> Key: SPARK-47696
> URL: https://issues.apache.org/jira/browse/SPARK-47696
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 4.0.0, 3.5.2, 3.4.3
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>  Labels: pull-request-available
>
> Currently, try_to_timestamp will throw an exception on legacy timestamp input.
> {code:java}
> > SELECT try_to_timestamp('2016-12-1', '-MM-dd')
> org.apache.spark.SparkUpgradeException: 
> [INCONSISTENT_BEHAVIOR_CROSS_VERSION.PARSE_DATETIME_BY_NEW_PARSER] You may 
> get a different result due to the upgrading to Spark >= 3.0:
> Fail to parse '2016-12-1' in the new parser.
> You can set "spark.sql.legacy.timeParserPolicy" to "LEGACY" to restore the 
> behavior before Spark 3.0, or set to "CORRECTED" and treat it as an invalid 
> datetime string. SQLSTATE: 42K0B {code}
> It should return null instead of raising an error.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-48011) Store LogKey name as a value to avoid generating new string instances

2024-04-26 Thread Gengliang Wang (Jira)
Gengliang Wang created SPARK-48011:
--

 Summary: Store LogKey name as a value to avoid generating new 
string instances
 Key: SPARK-48011
 URL: https://issues.apache.org/jira/browse/SPARK-48011
 Project: Spark
  Issue Type: Sub-task
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Gengliang Wang
Assignee: Gengliang Wang






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47963) Make the external Spark ecosystem able to use structured logging mechanisms

2024-04-26 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47963?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47963.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46193
[https://github.com/apache/spark/pull/46193]

> Make the external Spark ecosystem able to use structured logging mechanisms 
> 
>
> Key: SPARK-47963
> URL: https://issues.apache.org/jira/browse/SPARK-47963
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47583) SQL core: Migrate logError with variables to structured logging framework

2024-04-24 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47583?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47583.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45969
[https://github.com/apache/spark/pull/45969]

> SQL core: Migrate logError with variables to structured logging framework
> -
>
> Key: SPARK-47583
> URL: https://issues.apache.org/jira/browse/SPARK-47583
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Daniel
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47604) Resource managers: Migrate logInfo with variables to structured logging framework

2024-04-23 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47604?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47604.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46130
[https://github.com/apache/spark/pull/46130]

> Resource managers: Migrate logInfo with variables to structured logging 
> framework
> -
>
> Key: SPARK-47604
> URL: https://issues.apache.org/jira/browse/SPARK-47604
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: BingKun Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47600) MLLib: Migrate logInfo with variables to structured logging framework

2024-04-22 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47600?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47600.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46151
[https://github.com/apache/spark/pull/46151]

> MLLib: Migrate logInfo with variables to structured logging framework
> -
>
> Key: SPARK-47600
> URL: https://issues.apache.org/jira/browse/SPARK-47600
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-47907) Put removal of '!' as a synonym for 'NOT' on a keyword level under a config

2024-04-22 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47907?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-47907:
--

Assignee: Serge Rielau

> Put removal of '!' as a synonym for 'NOT' on a keyword level under a config
> ---
>
> Key: SPARK-47907
> URL: https://issues.apache.org/jira/browse/SPARK-47907
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Serge Rielau
>Assignee: Serge Rielau
>Priority: Major
>  Labels: pull-request-available
>
> Recently we dissolved the lexer equivalence between '!' and 'NOT'.
> ! is a prefix operator and a synonym for NOT only in that case.
> But NOT is used in many more cases in the grammar.
> Given that there are a handful of known scenarios where users have exploited 
> the undocumented loophole, it's best to add a config.
> Usage found so far is:
> `c1 ! IN(1, 2)`
> `c1 ! BETWEEN 1 AND 2`
> `c1 ! LIKE 'a%'`
>  But there are worse cases:
> c1 IS ! NULL
> CREATE TABLE T(c1 INT ! NULL)
> or even
> CREATE TABLE IF ! EXISTS T(c1 INT)
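For example, re-enabling the loophole behind the new config could look like the following, assuming an active SparkSession `spark`; the config name is an assumption for illustration and should be checked against the linked PR:

{code:java}
// Hedged sketch: restore the legacy '!'-as-NOT behavior via a legacy config
// (config name illustrative, not confirmed by this ticket).
spark.conf.set("spark.sql.legacy.bangEqualsNot", "true")
spark.sql("SELECT c1 FROM VALUES (1), (3) AS t(c1) WHERE c1 ! IN (1, 2)").show()
// With the legacy behavior re-enabled this parses as `c1 NOT IN (1, 2)`.
{code}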



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47907) Put removal of '!' as a synonym for 'NOT' on a keyword level under a config

2024-04-22 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47907?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47907.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46138
[https://github.com/apache/spark/pull/46138]

> Put removal of '!' as a synonym for 'NOT' on a keyword level under a config
> ---
>
> Key: SPARK-47907
> URL: https://issues.apache.org/jira/browse/SPARK-47907
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Serge Rielau
>Assignee: Serge Rielau
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Recently we dissolved the lexer equivalence between '!' and 'NOT'.
> ! is a prefix operator and a synonym for NOT only in that case.
> But NOT is used in many more cases in the grammar.
> Given that there are a handful of known scenarios where users have exploited 
> the undocumented loophole, it's best to add a config.
> Usage found so far is:
> `c1 ! IN(1, 2)`
> `c1 ! BETWEEN 1 AND 2`
> `c1 ! LIKE 'a%'`
>  But there are worse cases:
> c1 IS ! NULL
> CREATE TABLE T(c1 INT ! NULL)
> or even
> CREATE TABLE IF ! EXISTS T(c1 INT)



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47596) Streaming: Migrate logWarn with variables to structured logging framework

2024-04-18 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47596?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47596.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46079
[https://github.com/apache/spark/pull/46079]

> Streaming: Migrate logWarn with variables to structured logging framework
> -
>
> Key: SPARK-47596
> URL: https://issues.apache.org/jira/browse/SPARK-47596
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: BingKun Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47591) Hive-thriftserver: Migrate logInfo with variables to structured logging framework

2024-04-17 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47591?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47591.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45926
[https://github.com/apache/spark/pull/45926]

> Hive-thriftserver: Migrate logInfo with variables to structured logging 
> framework
> -
>
> Key: SPARK-47591
> URL: https://issues.apache.org/jira/browse/SPARK-47591
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Haejoon Lee
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47584) SQL core: Migrate logWarn with variables to structured logging framework

2024-04-17 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47584?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47584.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46057
[https://github.com/apache/spark/pull/46057]

> SQL core: Migrate logWarn with variables to structured logging framework
> 
>
> Key: SPARK-47584
> URL: https://issues.apache.org/jira/browse/SPARK-47584
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: BingKun Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47627) MERGE with WITH SCHEMA EVOLUTION keywords

2024-04-17 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47627?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47627.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45748
[https://github.com/apache/spark/pull/45748]

> MERGE with WITH SCHEMA EVOLUTION keywords
> -
>
> Key: SPARK-47627
> URL: https://issues.apache.org/jira/browse/SPARK-47627
> Project: Spark
>  Issue Type: New Feature
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Pengfei Xu
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47588) Hive module: Migrate logInfo with variables to structured logging framework

2024-04-16 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47588?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47588.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46086
[https://github.com/apache/spark/pull/46086]

> Hive module: Migrate logInfo with variables to structured logging framework
> ---
>
> Key: SPARK-47588
> URL: https://issues.apache.org/jira/browse/SPARK-47588
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Gengliang Wang
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47590) Hive-thriftserver: Migrate logWarn with variables to structured logging framework

2024-04-16 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47590?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47590.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45923
[https://github.com/apache/spark/pull/45923]

> Hive-thriftserver: Migrate logWarn with variables to structured logging 
> framework
> -
>
> Key: SPARK-47590
> URL: https://issues.apache.org/jira/browse/SPARK-47590
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Haejoon Lee
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-47594) Connector module: Migrate logInfo with variables to structured logging framework

2024-04-16 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47594?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-47594:
--

Assignee: BingKun Pan

> Connector module: Migrate logInfo with variables to structured logging 
> framework
> 
>
> Key: SPARK-47594
> URL: https://issues.apache.org/jira/browse/SPARK-47594
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: BingKun Pan
>Priority: Major
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47594) Connector module: Migrate logInfo with variables to structured logging framework

2024-04-16 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47594?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47594.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 46022
[https://github.com/apache/spark/pull/46022]

> Connector module: Migrate logInfo with variables to structured logging 
> framework
> 
>
> Key: SPARK-47594
> URL: https://issues.apache.org/jira/browse/SPARK-47594
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: BingKun Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47804) Add Dataframe cache debug log

2024-04-15 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47804?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47804.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45990
[https://github.com/apache/spark/pull/45990]

> Add Dataframe cache debug log
> -
>
> Key: SPARK-47804
> URL: https://issues.apache.org/jira/browse/SPARK-47804
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Xinyi Yu
>Assignee: Xinyi Yu
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Add debug log for dataframe cache.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47603) Resource managers: Migrate logWarn with variables to structured logging framework

2024-04-15 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47603?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47603.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45957
[https://github.com/apache/spark/pull/45957]

> Resource managers: Migrate logWarn with variables to structured logging 
> framework
> -
>
> Key: SPARK-47603
> URL: https://issues.apache.org/jira/browse/SPARK-47603
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: BingKun Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47792) Make the value of MDC support `null`

2024-04-11 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47792?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47792.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45975
[https://github.com/apache/spark/pull/45975]

> Make the value of MDC support `null`
> 
>
> Key: SPARK-47792
> URL: https://issues.apache.org/jira/browse/SPARK-47792
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47587) Hive module: Migrate logWarn with variables to structured logging framework

2024-04-10 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47587?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47587.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45927
[https://github.com/apache/spark/pull/45927]

> Hive module: Migrate logWarn with variables to structured logging framework
> ---
>
> Key: SPARK-47587
> URL: https://issues.apache.org/jira/browse/SPARK-47587
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: BingKun Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47601) Graphx: Migrate logs with variables to structured logging framework

2024-04-10 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47601?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47601.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45947
[https://github.com/apache/spark/pull/45947]

> Graphx:  Migrate logs with variables to structured logging framework
> 
>
> Key: SPARK-47601
> URL: https://issues.apache.org/jira/browse/SPARK-47601
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Haejoon Lee
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-47601) Graphx: Migrate logs with variables to structured logging framework

2024-04-10 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47601?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-47601:
--

Assignee: Haejoon Lee

> Graphx:  Migrate logs with variables to structured logging framework
> 
>
> Key: SPARK-47601
> URL: https://issues.apache.org/jira/browse/SPARK-47601
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Haejoon Lee
>Priority: Major
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47595) Streaming: Migrate logError with variables to structured logging framework

2024-04-10 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47595?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47595.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45910
[https://github.com/apache/spark/pull/45910]

> Streaming: Migrate logError with variables to structured logging framework
> --
>
> Key: SPARK-47595
> URL: https://issues.apache.org/jira/browse/SPARK-47595
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: BingKun Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47593) Connector module: Migrate logWarn with variables to structured logging framework

2024-04-09 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47593?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47593.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45879
[https://github.com/apache/spark/pull/45879]

> Connector module: Migrate logWarn with variables to structured logging 
> framework
> 
>
> Key: SPARK-47593
> URL: https://issues.apache.org/jira/browse/SPARK-47593
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: BingKun Pan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-47586) Hive module: Migrate logError with variables to structured logging framework

2024-04-09 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47586?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang resolved SPARK-47586.

Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 45876
[https://github.com/apache/spark/pull/45876]

> Hive module: Migrate logError with variables to structured logging framework
> 
>
> Key: SPARK-47586
> URL: https://issues.apache.org/jira/browse/SPARK-47586
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Gengliang Wang
>Assignee: Haejoon Lee
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-47783) Refresh error-states.json

2024-04-09 Thread Gengliang Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-47783?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-47783:
--

Assignee: Serge Rielau

> Refresh error-states.json
> -
>
> Key: SPARK-47783
> URL: https://issues.apache.org/jira/browse/SPARK-47783
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Serge Rielau
>Assignee: Serge Rielau
>Priority: Major
>  Labels: pull-request-available
>
> We want to add more SQLSTATEs to the menu to prevent collisions and do some 
> general cleanup



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org


