MaxGekk opened a new pull request, #45140:
URL: https://github.com/apache/spark/pull/45140

   ### What changes were proposed in this pull request?
   In this PR, I propose to add one more field to the keys of `supportedFormat` in
   `IntervalUtils`, because the current implementation has duplicate keys that
   overwrite each other. For instance, the following keys are identical:
   ```
   (YM.YEAR, YM.MONTH)
   ...
   (DT.DAY, DT.HOUR)
   ```
   because `YM.YEAR = DT.DAY = 0` and `YM.MONTH = DT.HOUR = 1`.
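
   For illustration, here is a minimal, self-contained Scala sketch (not the actual `IntervalUtils` code; the field constants and format strings below are placeholders) showing why a map keyed only by the two field ordinals drops an entry, and how adding one more discriminating field to the key keeps both:
   ```scala
   object DuplicateKeySketch {
     // Placeholder stand-ins for the ANSI interval field ordinals:
     // YEAR and DAY are both 0, MONTH and HOUR are both 1.
     val YEAR: Byte = 0
     val MONTH: Byte = 1
     val DAY: Byte = 0
     val HOUR: Byte = 1

     // Keyed only by (startField, endField): the (0, 1) key appears twice,
     // so the day-time entry silently overwrites the year-month one.
     val collidingFormats: Map[(Byte, Byte), Seq[String]] = Map(
       (YEAR, MONTH) -> Seq("[+|-]y-m"),
       (DAY, HOUR)   -> Seq("[+|-]d h"))

     // Adding a discriminator for the interval style makes every key unique.
     val fixedFormats: Map[(String, Byte, Byte), Seq[String]] = Map(
       ("year-month", YEAR, MONTH) -> Seq("[+|-]y-m"),
       ("day-time",   DAY, HOUR)   -> Seq("[+|-]d h"))

     def main(args: Array[String]): Unit = {
       println(collidingFormats.size) // 1: the year-month entry was lost
       println(fixedFormats.size)     // 2: both formats are kept
     }
   }
   ```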
   
   This is a backport of https://github.com/apache/spark/pull/45127.
   
   ### Why are the changes needed?
   To fix the incorrect error message shown when Spark cannot parse an ANSI interval
   string. For example, the expected format should be a year-month one, but Spark
   reports a day-time format instead:
   ```sql
   spark-sql (default)> select interval '-\t2-2\t' year to month;
   
   Interval string does not match year-month format of `[+|-]d h`, `INTERVAL [+|-]'[+|-]d h' DAY TO HOUR` when cast to interval year to month: -        2-2  . (line 1, pos 16)
   
   == SQL ==
   select interval '-\t2-2\t' year to month
   ----------------^^^
   ```
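
   With the fix, the same query is expected to reference the year-month patterns instead. A rough sketch of the corrected message (the `[+|-]y-m` patterns are assumed from the year-month entries of `supportedFormat`; the exact wording may differ):
   ```
   Interval string does not match year-month format of `[+|-]y-m`, `INTERVAL [+|-]'[+|-]y-m' YEAR TO MONTH` when cast to interval year to month
   ```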
   
   ### Does this PR introduce _any_ user-facing change?
   Yes, the error message for unparsable ANSI interval strings now refers to the formats of the matching interval type.
   
   ### How was this patch tested?
   By running the existing test suite:
   ```
   $ build/sbt "test:testOnly *IntervalUtilsSuite"
   ```
   and regenerating the golden files:
   ```
   $ SPARK_GENERATE_GOLDEN_FILES=1 PYSPARK_PYTHON=python3 build/sbt "sql/testOnly org.apache.spark.sql.SQLQueryTestSuite"
   ```
   
   ### Was this patch authored or co-authored using generative AI tooling?
   No.
   
   Authored-by: Max Gekk <max.g...@gmail.com>
   (cherry picked from commit 074fcf2807000d342831379de0fafc1e49a6bf19)

