liukun4515 commented on issue #3520:
URL: 
https://github.com/apache/arrow-datafusion/issues/3520#issuecomment-1252207917

   > My proposals are based on years of Oracle and Postgres use though, I have 
no Spark experience. What other thoughts and opinions are out there? How does 
Spark behave in these cases?
   
   For casts, if converting a value to another type overflows, the
default result is NULL.
   
   For mathematical operations, we should add an option to control this behavior.
   
   I think either behavior is acceptable, but we should make them consistent.
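   The two behaviors can be sketched in plain Rust using the standard library's checked and wrapping integer operations (a minimal analogy, not DataFusion's actual kernels): a checked operation yields `None`, which maps naturally to a NULL result, while a wrapping operation silently wraps around, which matches Spark's default arithmetic behavior.

   ```rust
   fn main() {
       // Cast overflow: i32::MAX does not fit in i16, so a checked
       // conversion fails -- analogous to returning NULL.
       let cast: Option<i16> = i16::try_from(i32::MAX).ok();
       assert_eq!(cast, None);

       // Arithmetic overflow, behavior 1: silently wrap around
       // (Spark's default without ANSI mode).
       assert_eq!(i32::MAX.wrapping_add(1), i32::MIN);

       // Arithmetic overflow, behavior 2: yield None / NULL,
       // consistent with the cast behavior above.
       assert_eq!(i32::MAX.checked_add(1), None);
   }
   ```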
   
   @kmitchener 
   
   In Spark, if we don't set the relevant parameter, Spark will not throw
an error and will just return the wrapped value.
   
   You can try it.
   
   The relevant Spark docs:
https://spark.apache.org/docs/latest/sql-ref-ansi-compliance.html
   
https://spark.apache.org/docs/latest/sql-ref-ansi-compliance.html#arithmetic-operations
   
   cc @alamb 

