[GitHub] [spark] HyukjinKwon commented on a change in pull request #28600: [SPARK-31761][SQL] cast integer to Long to avoid IntegerOverflow for IntegralDivide operator

2020-05-21 Thread GitBox


HyukjinKwon commented on a change in pull request #28600:
URL: https://github.com/apache/spark/pull/28600#discussion_r429034529



##
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala
##
@@ -337,11 +340,19 @@ trait DivModLike extends BinaryArithmetic {
 } else {
   s"${eval2.value} == 0"
 }
-val javaType = CodeGenerator.javaType(dataType)
+val isIntegralDiv = this.isInstanceOf[IntegralDivide]
+// From SPARK-16323 IntegralDivision returns Long data type
+val javaType = if (isIntegralDiv) JAVA_LONG else CodeGenerator.javaType(dataType)
+val operandJavaType = if (isIntegralDiv) operandsDataType match {
+  case _: IntegerType => JAVA_LONG

Review comment:
   Okay, allowing it seems to make sense. It will also keep compatibility.
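   A minimal standalone sketch (plain Scala, not the actual Spark codegen) of why widening the operands matters: JVM Int division wraps Integer.MIN_VALUE / -1 back to Integer.MIN_VALUE, while Long division yields the expected 2147483648, which matches the Long result type IntegralDivide already has since SPARK-16323.

       object IntegralDivideOverflowSketch {
         def main(args: Array[String]): Unit = {
           // Int division overflows: the JLS defines MIN_VALUE / -1 as MIN_VALUE.
           val intResult: Int = Integer.MIN_VALUE / -1            // -2147483648
           // Widening to Long before dividing avoids the wrap-around.
           val longResult: Long = Integer.MIN_VALUE.toLong / -1L  // 2147483648
           println(s"Int division:  $intResult")
           println(s"Long division: $longResult")
         }
       }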








[GitHub] [spark] HyukjinKwon commented on a change in pull request #28600: [SPARK-31761][SQL] cast integer to Long to avoid IntegerOverflow for IntegralDivide operator

2020-05-21 Thread GitBox


HyukjinKwon commented on a change in pull request #28600:
URL: https://github.com/apache/spark/pull/28600#discussion_r428991924



##
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala
##
@@ -337,11 +340,19 @@ trait DivModLike extends BinaryArithmetic {
 } else {
   s"${eval2.value} == 0"
 }
-val javaType = CodeGenerator.javaType(dataType)
+val isIntegralDiv = this.isInstanceOf[IntegralDivide]
+// From SPARK-16323 IntegralDivision returns Long data type
+val javaType = if (isIntegralDiv) JAVA_LONG else CodeGenerator.javaType(dataType)
+val operandJavaType = if (isIntegralDiv) operandsDataType match {
+  case _: IntegerType => JAVA_LONG

Review comment:
   Yeah, let's handle it there if other DBMSes allow it.
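   For context, widening to Long removes the Int overflow, but a Long integral division can still wrap for Long.MinValue / -1L. A hedged sketch (illustrative only, not Spark's actual ANSI implementation) of the kind of guard an ANSI-style mode would need:

       // Illustrative guard: Long.MinValue / -1L silently wraps on the JVM,
       // so an ANSI-style mode has to check for it explicitly.
       def guardedIntegralDivide(a: Long, b: Long): Long = {
         if (a == Long.MinValue && b == -1L) {
           throw new ArithmeticException("long overflow")
         }
         a / b
       }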








[GitHub] [spark] HyukjinKwon commented on a change in pull request #28600: [SPARK-31761][SQL] cast integer to Long to avoid IntegerOverflow for IntegralDivide operator

2020-05-21 Thread GitBox


HyukjinKwon commented on a change in pull request #28600:
URL: https://github.com/apache/spark/pull/28600#discussion_r428991160



##
File path: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/ArithmeticExpressionSuite.scala
##
@@ -505,4 +505,9 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
   checkEvaluation(e6, 0.toByte)
 }
   }
+
+  test("SPARK-31761: test integer overflow for (Divide) integral type ") {
+    checkEvaluation(IntegralDivide(Literal(Integer.MIN_VALUE), Literal(-1)), Integer.MIN_VALUE.toLong * -1)

Review comment:
   Does it overflow in other DBMSes? If they do, I think we should just let it overflow and guide the behavior via the ANSI configuration. See also https://github.com/apache/spark/commit/ee41001949af43d25dc6962ab6ca277e53c64299
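   As an illustration of the ANSI-configuration route suggested above, a hypothetical spark-shell session (the exact behavior depends on the Spark version and on whether the overflow check is implemented for this operator):

       // Hypothetical spark-shell snippet: with ANSI mode on, overflowing
       // arithmetic is expected to raise an error instead of silently wrapping;
       // with it off, the result follows the (possibly widened) Java semantics.
       spark.conf.set("spark.sql.ansi.enabled", "true")
       spark.sql("SELECT cast(-2147483648 as int) div -1").show()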




