[GitHub] spark pull request #22494: [SPARK-22036][SQL][followup] add a new config for...

2018-09-21 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/spark/pull/22494#discussion_r219492885
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
@@ -1345,6 +1345,16 @@ object SQLConf {
   .booleanConf
   .createWithDefault(true)
 
+  val LITERAL_PRECISE_PRECISION =
--- End diff --

mmh... PRECISE_PRECISION sounds weird... can we look for a better name? I 
don't have a suggestion right now, but I'll think about it.


---




[GitHub] spark pull request #22494: [SPARK-22036][SQL][followup] add a new config for...

2018-09-21 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/spark/pull/22494#discussion_r219493091
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
@@ -1345,6 +1345,16 @@ object SQLConf {
   .booleanConf
   .createWithDefault(true)
 
+  val LITERAL_PRECISE_PRECISION =
+    buildConf("spark.sql.literal.precisePrecision")
+      .internal()
+      .doc("When integral literals are used with decimals in binary operators, Spark will " +
+        "pick a precise precision for the literals to calculate the precision and scale " +
--- End diff --

`a precise precision` -> `the minimal precision required to represent the 
given value`?
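
For illustration, a minimal sketch (plain Scala, not necessarily the exact code path this PR touches) of what "the minimal precision required to represent the given value" means for an integral literal: it is simply the number of decimal digits of the value. The helper name below is made up for the example.

    // Sketch only: the minimal precision of an integral literal is its digit count.
    def minimalPrecisionFor(v: Long): Int = java.math.BigDecimal.valueOf(v).precision()

    minimalPrecisionFor(2L)    // 1 -> the literal fits in DecimalType(1, 0)
    minimalPrecisionFor(123L)  // 3 -> DecimalType(3, 0) instead of the generic
                               //      DecimalType(20, 0) used for longs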


---




[GitHub] spark pull request #22494: [SPARK-22036][SQL][followup] add a new config for...

2018-09-21 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/spark/pull/22494#discussion_r219493420
  
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/SQLQuerySuite.scala ---
@@ -2858,6 +2858,13 @@ class SQLQuerySuite extends QueryTest with SharedSQLContext {
 val result = ds.flatMap(_.bar).distinct
 result.rdd.isEmpty
   }
+
+  test("SPARK-25454: decimal division with negative scale") {
+// TODO: completely fix this issue even LITERAL_PRECISE_PRECISION is true.
--- End diff --

nit: `even when`
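
As additional context, a hedged, self-contained sketch of the kind of query this test is about (the exact query and expected result in the PR may differ): an exponent literal such as 1e6 can end up as a decimal with a negative scale, and dividing by it exercises the code path being discussed.

    import org.apache.spark.sql.SparkSession

    // Sketch only, not the test body from the PR.
    object NegativeScaleDivisionSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[1]")
          .appName("SPARK-25454 sketch")
          .getOrCreate()
        // Arithmetically, 26393499451 / (1e6 * 1000) = 26.393499451; the divisor
        // involves negative-scale decimals when exponent literals are parsed as decimals.
        spark.sql("select 26393499451 / (1e6 * 1000)").show(truncate = false)
        spark.stop()
      }
    }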


---




[GitHub] spark pull request #22494: [SPARK-22036][SQL][followup] add a new config for...

2018-09-21 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/spark/pull/22494#discussion_r219492191
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
@@ -1345,6 +1345,16 @@ object SQLConf {
   .booleanConf
   .createWithDefault(true)
 
+  val LITERAL_PRECISE_PRECISION =
+    buildConf("spark.sql.literal.precisePrecision")
+      .internal()
+      .doc("When integral literals are used with decimals in binary operators, Spark will " +
+        "pick a precise precision for the literals to calculate the precision and scale " +
+        "of the result decimal, when this config is true. By picking a precise precision, we " +
+        "can avoid wasting precision, to reduce the possibility of overflow.")
--- End diff --

`to reduce the possibility of overflow` actually this is not accurate: it depends on the 
value of `DECIMAL_OPERATIONS_ALLOW_PREC_LOSS`. If `DECIMAL_OPERATIONS_ALLOW_PREC_LOSS` is 
true, the risk is a precision loss, but we don't overflow. If it is false, then this 
statement is right.
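
To make the trade-off concrete, a small illustration using the standard result-type rule Spark applies for decimal multiplication (result precision = p1 + p2 + 1, result scale = s1 + s2); the helper below is only for this sketch.

    // Illustrative only: how the literal's precision feeds the result type.
    def multiplyResultType(p1: Int, s1: Int, p2: Int, s2: Int): (Int, Int) =
      (p1 + p2 + 1, s1 + s2)

    // DECIMAL(38, 18) column * long literal typed as the generic DecimalType(20, 0):
    multiplyResultType(38, 18, 20, 0)  // (59, 18): exceeds 38 digits, so we either lose
                                       // precision (ALLOW_PREC_LOSS = true) or risk a
                                       // null on overflow (ALLOW_PREC_LOSS = false)
    // The same literal typed with its minimal precision, DecimalType(1, 0):
    multiplyResultType(38, 18, 1, 0)   // (40, 18): far less precision is wasted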


---




[GitHub] spark pull request #22494: [SPARK-22036][SQL][followup] add a new config for...

2018-09-21 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/spark/pull/22494#discussion_r219505527
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
@@ -1345,6 +1345,15 @@ object SQLConf {
   .booleanConf
   .createWithDefault(true)
 
+  val LITERAL_PICK_MINIMUM_PRECISION =
+    buildConf("spark.sql.literal.pickMinimumPrecision")
--- End diff --

nit: I am wondering whether we can consider this a `legacy` config. We could also 
remove it once the PR for SPARK-25454 is merged and/or we drop support for 
negative-scale decimals. What do you think?
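
For instance, a sketch of that suggestion reusing the builder pattern from the diff above (the legacy-style name and doc wording are only illustrative):

    val LITERAL_PICK_MINIMUM_PRECISION =
      buildConf("spark.sql.legacy.literal.pickMinimumPrecision")
        .internal()
        .doc("Legacy config: when integral literals are used with decimals in binary " +
          "operators, pick the minimal precision required to represent the literal when " +
          "deriving the result precision and scale. Can be removed once negative-scale " +
          "decimals are no longer supported.")
        .booleanConf
        .createWithDefault(true)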


---
