cloud-fan commented on code in PR #37915: URL: https://github.com/apache/spark/pull/37915#discussion_r977125557
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/types/Decimal.scala:
##########

```diff
@@ -28,52 +28,35 @@ import org.apache.spark.sql.internal.SQLConf
 import org.apache.spark.unsafe.types.UTF8String
 
 /**
- * A mutable implementation of BigDecimal that can hold a Long if values are small enough.
- *
- * The semantics of the fields are as follows:
- * - _precision and _scale represent the SQL precision and scale we are looking for
- * - If decimalVal is set, it represents the whole decimal value
- * - Otherwise, the decimal value is longVal / (10 ** _scale)
- *
- * Note, for values between -1.0 and 1.0, precision digits are only counted after dot.
+ * A mutable implementation of BigDecimal that hold a `DecimalOperation`.
  */
 @Unstable
-final class Decimal extends Ordered[Decimal] with Serializable {
+final class Decimal(initEnabled: Boolean = true) extends Ordered[Decimal] with Serializable {
   import org.apache.spark.sql.types.Decimal._
 
-  private var decimalVal: BigDecimal = null
-  private var longVal: Long = 0L
-  private var _precision: Int = 1
-  private var _scale: Int = 0
+  private var decimalOperation: DecimalOperation[_] = null
```

Review Comment:
   Does it mean `decimalOperation` can be null sometimes? This seems fragile.

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
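The fragility the reviewer is pointing at can be sketched with a small, hypothetical example (not the PR's actual code — `DecimalOperation`, `NullableDecimal`, and `EagerDecimal` below are illustrative stand-ins): with a nullable internal field, every method must remember a null check, whereas an eagerly initialized field keeps the invariant "always set" by construction.

```scala
// Hypothetical stand-in for the PR's DecimalOperation abstraction.
trait DecimalOperation {
  def toBigDecimal: BigDecimal
}

// A simple long-backed implementation: value = unscaled / 10^scale.
final class LongDecimalOperation(unscaled: Long, scale: Int) extends DecimalOperation {
  def toBigDecimal: BigDecimal = BigDecimal(unscaled) / BigDecimal(10).pow(scale)
}

// Fragile pattern: the field starts as null, so every accessor
// must defend against an uninitialized instance.
final class NullableDecimal {
  private var op: DecimalOperation = null

  def set(unscaled: Long, scale: Int): this.type = {
    op = new LongDecimalOperation(unscaled, scale)
    this
  }

  def toBigDecimal: BigDecimal = {
    if (op == null) {  // easy to forget in any new method
      throw new IllegalStateException("Decimal not initialized")
    }
    op.toBigDecimal
  }
}

// Safer pattern: initialize to a valid zero value, so `op` is never null
// and no accessor needs a null check.
final class EagerDecimal {
  private var op: DecimalOperation = new LongDecimalOperation(0L, 0)

  def set(unscaled: Long, scale: Int): this.type = {
    op = new LongDecimalOperation(unscaled, scale)
    this
  }

  def toBigDecimal: BigDecimal = op.toBigDecimal
}
```

The trade-off (which the `initEnabled` constructor flag in the PR presumably exists to manage) is that eager initialization allocates an operation object even for instances that are immediately overwritten; the nullable variant avoids that at the cost of a weaker invariant.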