cloud-fan commented on code in PR #46338:
URL: https://github.com/apache/spark/pull/46338#discussion_r1590326222


##########
common/variant/src/main/java/org/apache/spark/types/variant/VariantUtil.java:
##########
@@ -392,21 +392,32 @@ public static double getDouble(byte[] value, int pos) {
     return Double.longBitsToDouble(readLong(value, pos + 1, 8));
   }
 
+  // Check whether the precision and scale of the decimal are within the limit.
+  private static void checkDecimal(BigDecimal d, int maxPrecision) {
+    if (d.precision() > maxPrecision || d.scale() > maxPrecision) {
+      throw malformedVariant();
+    }
+  }
+
   // Get a decimal value from variant value `value[pos...]`.
   // Throw `MALFORMED_VARIANT` if the variant is malformed.
   public static BigDecimal getDecimal(byte[] value, int pos) {
     checkIndex(pos, value.length);
     int basicType = value[pos] & BASIC_TYPE_MASK;
     int typeInfo = (value[pos] >> BASIC_TYPE_BITS) & TYPE_INFO_MASK;
     if (basicType != PRIMITIVE) throw unexpectedType(Type.DECIMAL);
-    int scale = value[pos + 1];
+    // Interpret the scale byte as unsigned. If it is a negative byte, the unsigned value must be
+    // greater than `MAX_DECIMAL16_PRECISION` and will trigger an error in `checkDecimal`.

Review Comment:
   We should distinguish between the variant decimal spec and Spark's decimal limitations. Some databases (and Java libraries) support a negative decimal scale, which means the value range is that of `decimal(precision, 0)` multiplied by `10^(-scale)`. To read such a decimal, e.g. `decimal(10, -5)`, we can turn it into the Spark decimal `decimal(15, 0)`.
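   For illustration only (not part of this PR), a minimal Java sketch of that conversion, assuming the only goal is to rewrite a negative-scale value with scale 0; the helper name is hypothetical:
   
   ```java
   import java.math.BigDecimal;
   import java.math.BigInteger;
   
   public class NegativeScaleSketch {
     // Hypothetical helper: rewrite a negative-scale decimal as an equivalent
     // decimal with scale 0, e.g. a decimal(10, -5) value fits in decimal(15, 0).
     static BigDecimal widenNegativeScale(BigDecimal d) {
       // setScale(0) is exact here: a negative scale only appends trailing zeros,
       // so no rounding is needed and no ArithmeticException can be thrown.
       return d.scale() < 0 ? d.setScale(0) : d;
     }
   
     public static void main(String[] args) {
       // Unscaled value 12345 with scale -5 represents 12345 * 10^5.
       BigDecimal d = new BigDecimal(BigInteger.valueOf(12345), -5);
       System.out.println(widenNegativeScale(d)); // prints 1234500000
     }
   }
   ```
   
   The widened precision is `precision - scale`, so any Spark-side precision limit would have to be checked against that value rather than the original precision.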


