[jira] [Commented] (SPARK-41299) OOM when filter pushdown `last_day` function

2022-11-29 Thread Hyukjin Kwon (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-41299?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17641048#comment-17641048
 ] 

Hyukjin Kwon commented on SPARK-41299:
--

It would be great if we had a minimal reproducer.

> OOM when filter pushdown `last_day` function
> 
>
> Key: SPARK-41299
> URL: https://issues.apache.org/jira/browse/SPARK-41299
> Project: Spark
>  Issue Type: Bug
>  Components: Optimizer
>Affects Versions: 3.3.1
> Environment: Spark 3.3.1
> JDK 8 (openjdk version "1.8.0_352")
>Reporter: André F.
>Priority: Major
>
> Using the following transformation on Spark 3.3.1:
> {code:java}
> df.where($"date" === last_day($"date")) {code}
> where `df` is a DataFrame created from a set of Parquet files. I'm trying to 
> filter rows whose `date` value falls on the last day of its month.
> Executors are dying with the following error:
> {code:java}
> java.lang.OutOfMemoryError: GC overhead limit exceeded
> at java.util.regex.Pattern.compile(Pattern.java:1722) ~[?:1.8.0_252]
> at java.util.regex.Pattern.<init>(Pattern.java:1352) ~[?:1.8.0_252]
> at java.util.regex.Pattern.compile(Pattern.java:1028) ~[?:1.8.0_252] {code}
> By *disabling* the predicate pushdown rule, the job works normally.
>  Also, this works normally on Spark 3.3.0, and I couldn't reproduce the 
> failure with other date functions.
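For context on what the pushed-down predicate computes: the filter keeps only rows whose `date` falls on the last day of its month. A minimal sketch of that check in plain Python (illustrative only — this is the predicate's semantics, not Spark's implementation):

```python
import calendar
import datetime

def is_last_day_of_month(d: datetime.date) -> bool:
    """True when d is the final day of its month, i.e. what
    last_day(date) = date tests for each row."""
    # monthrange returns (weekday of first day, number of days in month)
    return d.day == calendar.monthrange(d.year, d.month)[1]

print(is_last_day_of_month(datetime.date(2022, 2, 28)))   # True (non-leap February)
print(is_last_day_of_month(datetime.date(2022, 11, 29)))  # False (November has 30 days)
```

The reporter's workaround of disabling predicate pushdown could, for a Parquet source, look like `spark.conf.set("spark.sql.parquet.filterPushdown", "false")` — an assumption on my part; the ticket does not say which exact rule or option was disabled.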



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-41299) OOM when filter pushdown `last_day` function

2022-11-29 Thread Jira


[ 
https://issues.apache.org/jira/browse/SPARK-41299?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17640479#comment-17640479
 ] 

André F. commented on SPARK-41299:
--

The OOM happens before I can see the query plan on the UI. Is there another way 
to obtain it?




[jira] [Commented] (SPARK-41299) OOM when filter pushdown `last_day` function

2022-11-28 Thread Yuming Wang (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-41299?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17640347#comment-17640347
 ] 

Yuming Wang commented on SPARK-41299:
-

Do you have the query plan?
