[GitHub] [spark] yaooqinn commented on pull request #28650: [SPARK-31830][SQL] Consistent error handling for datetime formatting and parsing functions

2020-05-29 Thread GitBox


yaooqinn commented on pull request #28650:
URL: https://github.com/apache/spark/pull/28650#issuecomment-635783265


   OK






[GitHub] [spark] yaooqinn commented on pull request #28650: [SPARK-31830][SQL] Consistent error handling for datetime formatting and parsing functions

2020-05-28 Thread GitBox


yaooqinn commented on pull request #28650:
URL: https://github.com/apache/spark/pull/28650#issuecomment-635449894


   But maybe we should apply `SparkUpgradeException` to `format()` as we do for `parse()`, for a better error message for end users.
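
   A minimal sketch of the kind of wrapping suggested here, assuming a stand-in
   exception class: catch the failure from the new Java 8 time formatter on the
   format path and rethrow it with an upgrade hint, the way the parse path
   already does. `UpgradeHintException`, `FormatWithHint`, and the message text
   are illustrative, not the PR's actual code (Spark's real class is
   `SparkUpgradeException`).

   ```scala
   import java.time.LocalDateTime
   import java.time.format.DateTimeFormatter

   // Illustrative stand-in for the upgrade-hint error Spark raises on parse
   // failures (org.apache.spark.SparkUpgradeException in Spark 3.0).
   class UpgradeHintException(msg: String, cause: Throwable)
     extends RuntimeException(msg, cause)

   object FormatWithHint {
     // Wrap format() the same way parse() is wrapped: if the new formatter
     // fails, rethrow with a hint that behavior changed in Spark 3.0.
     def format(pattern: String, value: LocalDateTime): String = {
       try {
         DateTimeFormatter.ofPattern(pattern).format(value)
       } catch {
         case e: RuntimeException =>
           throw new UpgradeHintException(
             s"Fail to format '$value' with pattern '$pattern' in the new formatter. " +
               "You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore " +
               "the behavior before Spark 3.0.",
             e)
       }
     }

     def main(args: Array[String]): Unit = {
       // A valid pattern formats normally; a pattern the new formatter rejects
       // surfaces as UpgradeHintException instead of a bare JDK error.
       println(format("yyyy-MM-dd", LocalDateTime.now()))
     }
   }
   ```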






[GitHub] [spark] yaooqinn commented on pull request #28650: [SPARK-31830][SQL] Consistent error handling for datetime formatting and parsing functions

2020-05-28 Thread GitBox


yaooqinn commented on pull request #28650:
URL: https://github.com/apache/spark/pull/28650#issuecomment-635448724


   No, the pattern `yyy-MM-dd` is valid for both versions of the formatters, but calling `format()` throws an exception with the new formatter, and that exception is silently suppressed in the `FromUnixTime` expression. Since this PR no longer suppresses those exceptions, `from_unixtime` will behave the same as `date_format`:
   ```
   spark-sql>  select from_unixtime(1, 'yyy-MM-dd'); -- this is before
   NULL
   spark-sql>  select date_format('now', 'yyy-MM-dd'); -- this will be after
   20/05/29 00:14:47 ERROR SparkSQLDriver: Failed in [ select date_format('now', 'yyy-MM-dd')]
   java.lang.ArrayIndexOutOfBoundsException: 11
       at java.time.format.DateTimeFormatterBuilder$NumberPrinterParser.format(DateTimeFormatterBuilder.java:2568)
       at java.time.format.DateTimeFormatterBuilder$CompositePrinterParser.format(DateTimeFormatterBuilder.java:2190)
   ```
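
   For illustration, a self-contained sketch of the control-flow change described
   above, with a hypothetical `formatEpochSeconds` helper standing in for the real
   `FromUnixTime` internals: the old path swallows any formatter error and yields
   null, while the new path lets it propagate, just as `date_format` does.

   ```scala
   import java.time.{Instant, ZoneId}
   import java.time.format.DateTimeFormatter
   import scala.util.control.NonFatal

   object SuppressVsPropagate {
     // Hypothetical stand-in for the formatting work done by FromUnixTime.
     private def formatEpochSeconds(seconds: Long, pattern: String): String =
       DateTimeFormatter.ofPattern(pattern)
         .withZone(ZoneId.systemDefault())
         .format(Instant.ofEpochSecond(seconds))

     // Before: any formatter error is swallowed and the expression yields null.
     def formatOld(seconds: Long, pattern: String): String =
       try formatEpochSeconds(seconds, pattern)
       catch { case NonFatal(_) => null }

     // After: the error propagates, matching what date_format already does.
     def formatNew(seconds: Long, pattern: String): String =
       formatEpochSeconds(seconds, pattern)

     def main(args: Array[String]): Unit = {
       println(formatOld(1L, "yyyy-MM-dd")) // formats normally
       println(formatNew(1L, "yyyy-MM-dd")) // same result; a failing pattern would now throw
     }
   }
   ```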






[GitHub] [spark] yaooqinn commented on pull request #28650: [SPARK-31830][SQL] Consistent error handling for datetime formatting and parsing functions

2020-05-28 Thread GitBox


yaooqinn commented on pull request #28650:
URL: https://github.com/apache/spark/pull/28650#issuecomment-635436944


   > ```
   > spark-sql> select from_unixtime(1, 'yyy-MM-dd');
   > NULL
   > ```
   > 
   > Why don't we throw `SparkUpgradeException` in this case?
   
   That logic belongs to `parse()` only.
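
   As a rough sketch of what that parse-only logic looks like (illustrative names,
   not Spark's actual implementation): when the new parser rejects an input, the
   legacy parser is consulted, and only if the legacy parser would have accepted
   it is the upgrade hint raised; `format()` has no comparable fallback to compare
   against.

   ```scala
   import java.text.SimpleDateFormat
   import java.time.LocalDate
   import java.time.format.DateTimeFormatter
   import scala.util.control.NonFatal

   object ParseWithUpgradeHint {
     // Try the new parser first; if it fails but the legacy parser accepts the
     // input, raise an upgrade hint (SparkUpgradeException in the real code; a
     // plain RuntimeException keeps this sketch self-contained).
     def parseDate(s: String, pattern: String): LocalDate = {
       try {
         LocalDate.parse(s, DateTimeFormatter.ofPattern(pattern))
       } catch {
         case NonFatal(e) =>
           val legacyAccepts =
             try {
               val legacy = new SimpleDateFormat(pattern)
               legacy.setLenient(false)
               legacy.parse(s)
               true
             } catch { case NonFatal(_) => false }
           if (legacyAccepts) {
             throw new RuntimeException(
               s"Fail to parse '$s' with pattern '$pattern' in the new parser; " +
                 "the legacy parser accepts it, so results may differ after upgrading.",
               e)
           } else {
             throw e
           }
       }
     }
   }
   ```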


