I'm not sure if there is a way around this; I'm just looking for advice.
I create a DataFrame from some decimals with a specific precision and
scale, but when I look at the DataFrame the precision and scale have been
defaulted back again.
Is there a way to retain the precision and scale when creating a
DataFrame from decimals?
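For example, something along these lines (a rough sketch of what I'm doing;
the actual column names and values differ) ends up with the default
decimal(38,18) type rather than the precision and scale my values have:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

// BigDecimals with precision 5 and scale 2 ...
val df = Seq(Tuple1(BigDecimal("123.45")), Tuple1(BigDecimal("678.90"))).toDF("amount")

// ... but the inferred schema falls back to Spark's system default
df.printSchema()
// root
//  |-- amount: decimal(38,18) (nullable = true)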
Looks like what you observed is due to the following code in Decimal.scala:
def set(decimal: BigDecimal, precision: Int, scale: Int): Decimal = {
  this.decimalVal = decimal.setScale(scale, ROUND_HALF_UP)
  require(
    decimalVal.precision <= precision,
    s"Decimal precision ${decimalVal.precision} exceeds max precision $precision")
  this.longVal = 0L
  this._precision = precision
  this._scale = scale
  this
}