[jira] [Commented] (SPARK-35852) Improve the implementation for DateType +/- DayTimeIntervalType(DAY)
[ https://issues.apache.org/jira/browse/SPARK-35852?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17367801#comment-17367801 ]

Apache Spark commented on SPARK-35852:
--------------------------------------

User 'Peng-Lei' has created a pull request for this issue:
https://github.com/apache/spark/pull/33033

> Improve the implementation for DateType +/- DayTimeIntervalType(DAY)
> ---------------------------------------------------------------------
>
>                 Key: SPARK-35852
>                 URL: https://issues.apache.org/jira/browse/SPARK-35852
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: PengLei
>            Priority: Major
>             Fix For: 3.2.0
>
> Currently, `DateType +/- DayTimeIntervalType()` converts the DateType to
> TimestampType and then applies TimeAdd. When the interval type is
> DayTimeIntervalType(DAY), DateAdd can be used instead of TimeAdd.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
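The intuition behind the proposed change can be illustrated outside Spark: a DAY-precision interval carries only whole days, so plain day arithmetic on the date (DateAdd) gives the same result as promoting the date to a timestamp, adding the microsecond-precision interval, and casting back (TimeAdd). A minimal Python sketch of that equivalence, ignoring Spark's session time zone handling; the function names `time_add` and `date_add` are hypothetical stand-ins for Spark's internal expressions, not its actual API:

```python
from datetime import date, datetime, timedelta

def time_add(d: date, interval: timedelta) -> date:
    # Old path: promote the date to a timestamp at midnight,
    # add the interval, then cast the result back to a date.
    ts = datetime(d.year, d.month, d.day)
    return (ts + interval).date()

def date_add(d: date, days: int) -> date:
    # New path for DayTimeIntervalType(DAY): integer day arithmetic only.
    return d + timedelta(days=days)

d = date(2021, 6, 22)
interval = timedelta(days=3)  # a DAY-precision interval holds whole days only

# Both paths land on the same date when the interval is whole days.
assert time_add(d, interval) == date_add(d, 3)
```

When the interval has sub-day components (hours, minutes, seconds), the timestamp round trip is still required, which is why the optimization is scoped to DayTimeIntervalType(DAY).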
[jira] [Commented] (SPARK-35852) Improve the implementation for DateType +/- DayTimeIntervalType(DAY)

[ https://issues.apache.org/jira/browse/SPARK-35852?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17367800#comment-17367800 ]

Apache Spark commented on SPARK-35852:
--------------------------------------

User 'Peng-Lei' has created a pull request for this issue:
https://github.com/apache/spark/pull/33033
[jira] [Commented] (SPARK-35852) Improve the implementation for DateType +/- DayTimeIntervalType(DAY)

[ https://issues.apache.org/jira/browse/SPARK-35852?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17367332#comment-17367332 ]

Apache Spark commented on SPARK-35852:
--------------------------------------

User 'Peng-Lei' has created a pull request for this issue:
https://github.com/apache/spark/pull/33024
[jira] [Commented] (SPARK-35852) Improve the implementation for DateType +/- DayTimeIntervalType(DAY)

[ https://issues.apache.org/jira/browse/SPARK-35852?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17366990#comment-17366990 ]

PengLei commented on SPARK-35852:
---------------------------------

I am working on this.