[ https://issues.apache.org/jira/browse/SPARK-32132?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17148848#comment-17148848 ]

Juliusz Sompolski commented on SPARK-32132:
-------------------------------------------

Also, 2.4 adds "interval" at the start, while 3.0 does not, e.g. "interval 3 
days" in 2.4 vs. "3 days" in 3.0.
I actually think the new 3.0 results are better / more standard, and I 
haven't heard of anyone complaining that it broke the way they parse it.
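To make the difference concrete, here is a minimal sketch (in Python, purely illustrative, not Spark's actual CalendarInterval code) of how the same 30-day interval is rendered in the two styles under discussion; the function names are hypothetical:

```python
# Illustration only: the 2.4-style output splits days into weeks + days
# and prefixes "interval", while the 3.0-style output is a plain day count.

def to_2_4_style(days: int) -> str:
    """Render like Spark 2.4: 'interval 4 weeks 2 days' for 30 days."""
    weeks, rem = divmod(days, 7)
    parts = []
    if weeks:
        parts.append(f"{weeks} weeks")
    if rem:
        parts.append(f"{rem} days")
    return "interval " + " ".join(parts)

def to_3_0_style(days: int) -> str:
    """Render like Spark 3.0: plain '30 days'."""
    return f"{days} days"

print(to_2_4_style(30))  # interval 4 weeks 2 days
print(to_3_0_style(30))  # 30 days
```

Any client that parsed the 2.4 "weeks + days" shape would need to handle the flat day count instead, which is why the unit-test breakage mentioned below is expected.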

Edit: [~cloud_fan] posting now the above comment that I thought I posted 
yesterday, but it stayed unsent in an open tab. The change causes some 
issues with unit tests, but I think it shouldn't cause real-world problems, 
and in any case the new format is likely better going forward. Thanks for 
explaining.

> Thriftserver interval returns "4 weeks 2 days" in 2.4 and "30 days" in 3.0
> --------------------------------------------------------------------------
>
>                 Key: SPARK-32132
>                 URL: https://issues.apache.org/jira/browse/SPARK-32132
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Juliusz Sompolski
>            Priority: Minor
>
> In https://github.com/apache/spark/pull/26418, a setting 
> spark.sql.dialect.intervalOutputStyle was implemented, to control interval 
> output style. This PR also removed "toString" from CalendarInterval. This 
> change got reverted in https://github.com/apache/spark/pull/27304, and the 
> CalendarInterval.toString got implemented back in 
> https://github.com/apache/spark/pull/26572.
> But it behaves differently now: 2.4 returns "4 weeks 2 days", while 3.0 
> returns "30 days".
> Thriftserver uses HiveResults.toHiveString, which uses 
> CalendarInterval.toString to return interval results as strings. The 
> results are now different in 3.0.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
