[ https://issues.apache.org/jira/browse/SPARK-32021?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17141015#comment-17141015 ]
Apache Spark commented on SPARK-32021:
--------------------------------------

User 'MaxGekk' has created a pull request for this issue:
https://github.com/apache/spark/pull/28878

> make_interval does not accept seconds >100
> ------------------------------------------
>
>                 Key: SPARK-32021
>                 URL: https://issues.apache.org/jira/browse/SPARK-32021
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Juliusz Sompolski
>            Assignee: Maxim Gekk
>            Priority: Major
>             Fix For: 3.1.0
>
>
> In make_interval(years, months, weeks, days, hours, mins, secs), secs is
> defined as Decimal(8, 6), which leaves only two integer digits, so the
> result turns into null whenever the seconds expression is 100 or more.
> Larger seconds values should be allowed.
> This has been reported by Simba, who wants to use make_interval to implement
> the translation of the TIMESTAMP_ADD ODBC function in Spark 3.0.
> ODBC {fn TIMESTAMPADD(SECOND, integer_exp, timestamp)} fails when
> integer_exp returns a seconds value >= 100.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
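[Editor's illustration, not Spark code] The overflow follows directly from the decimal type's precision/scale: Decimal(8, 6) keeps 8 significant digits with 6 after the decimal point, leaving only 2 integer digits, so 99.999999 is the largest representable value. The sketch below mimics that bound with Python's stdlib decimal module; the function name fits_decimal and the null-on-overflow return are hypothetical stand-ins for Spark's non-ANSI behavior.

```python
from decimal import Decimal, ROUND_HALF_UP

def fits_decimal(value, precision=8, scale=6):
    """Return the value quantized to Decimal(precision, scale), or None if
    its integer part needs more than (precision - scale) digits
    (mimicking Spark's null-on-overflow behavior in non-ANSI mode)."""
    # Round to `scale` fractional digits, e.g. 6 digits for Decimal(8, 6).
    quantized = Decimal(str(value)).quantize(Decimal(1).scaleb(-scale),
                                             rounding=ROUND_HALF_UP)
    # The integer part must fit in precision - scale = 2 digits, i.e. < 100.
    if abs(quantized) >= Decimal(10) ** (precision - scale):
        return None
    return quantized

print(fits_decimal(99.999999))  # largest seconds value that still fits
print(fits_decimal(100))        # overflows -> None, i.e. a null interval
```

This is why {fn TIMESTAMPADD(SECOND, integer_exp, timestamp)} breaks as soon as integer_exp reaches 100: the seconds argument no longer fits the declared type, and the whole interval becomes null instead of raising an error.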