MaxGekk commented on a change in pull request #25210: [SPARK-28432][SQL] Add `make_date` function
URL: https://github.com/apache/spark/pull/25210#discussion_r305606571
########## File path: sql/core/src/test/resources/sql-tests/results/pgSQL/date.sql.out ##########

```diff
@@ -508,8 +508,48 @@ struct<Days From 2K:int>
 -- !query 47
-DROP TABLE DATE_TBL
+select make_date(2013, 7, 15)
 -- !query 47 schema
-struct<>
+struct<make_date(2013, 7, 15):date>
 -- !query 47 output
+2013-07-15
+
+
+-- !query 48
+select make_date(-44, 3, 15)
+-- !query 48 schema
+struct<make_date(-44, 3, 15):date>
+-- !query 48 output
+0045-03-15
```

Review comment:

The year `-44` is out of the valid range according to the SQL standard. We get `45` instead of `-44` while converting to `java.sql.Date`. If you switch to the Java 8 API for dates/timestamps:

```scala
scala> spark.conf.set("spark.sql.datetime.java8API.enabled", true)
scala> spark.sql("select make_date(-44, 3, 15)").collect
res7: Array[org.apache.spark.sql.Row] = Array([-0044-03-15])
```

the returned instance of `java.time.LocalDate` looks reasonable.
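For context, the `45` vs `-44` discrepancy can be reproduced with plain `java.time`, without Spark: the ISO chronology used by `java.time.LocalDate` has a year zero, so proleptic year `-44` corresponds to year-of-era `45` BCE. An era-based rendering of that date (as the legacy `java.sql.Date` path produces) therefore shows `0045`, which lines up with the `0045-03-15` in the golden file above. A minimal sketch (the class name is illustrative, not from the PR):

```java
import java.time.LocalDate;
import java.time.chrono.IsoEra;
import java.time.temporal.ChronoField;

public class NegativeYearDemo {
    public static void main(String[] args) {
        // Proleptic ISO year -44: java.time has a year zero, so -44 means 45 BCE.
        LocalDate d = LocalDate.of(-44, 3, 15);

        // ISO-8601 string keeps the signed proleptic year.
        System.out.println(d);                              // -0044-03-15

        // Era-based view of the same date: 45 BCE, matching the legacy output.
        System.out.println(d.getEra() == IsoEra.BCE);       // true
        System.out.println(d.get(ChronoField.YEAR_OF_ERA)); // 45
    }
}
```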