[ https://issues.apache.org/jira/browse/SPARK-43814?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ke Jia updated SPARK-43814:
---------------------------
    Summary: Spark cannot construct the DecimalType in CatalystTypeConverters.convertToCatalyst() API  (was: Spark cannot use the df.collect() result to construct the DecimalType in CatalystTypeConverters.convertToCatalyst() API)

> Spark cannot construct the DecimalType in 
> CatalystTypeConverters.convertToCatalyst() API
> ----------------------------------------------------------------------------------------
>
>                 Key: SPARK-43814
>                 URL: https://issues.apache.org/jira/browse/SPARK-43814
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.2.2, 3.3.2
>            Reporter: Ke Jia
>            Priority: Major
>
>  
> When the result of df.collect() is used to construct the DecimalType in 
> CatalystTypeConverters.convertToCatalyst(), the conversion fails with:
>  
> {code:java}
> Decimal scale (18) cannot be greater than precision (1).
> org.apache.spark.sql.AnalysisException: Decimal scale (18) cannot be greater than precision (1).
>         at org.apache.spark.sql.errors.QueryCompilationErrors$.decimalCannotGreaterThanPrecisionError(QueryCompilationErrors.scala:1671)
>         at org.apache.spark.sql.types.DecimalType.<init>(DecimalType.scala:48)
>         at org.apache.spark.sql.catalyst.CatalystTypeConverters$.convertToCatalyst(CatalystTypeConverters.scala:518)
>         at org.apache.spark.sql.DataFrameFunctionsSuite.$anonfun$new$712(DataFrameFunctionsSuite.scala:3714){code}
>  
>  
> This issue can be reproduced by the following case:
>  
> {code:scala}
>   import org.apache.spark.sql.{Column, Row}
>   import org.apache.spark.sql.catalyst.CatalystTypeConverters
>   import org.apache.spark.sql.catalyst.expressions.Literal
>   import org.apache.spark.sql.types.{DecimalType, IntegerType, StructField, StructType}
>
>   val expression = Literal.default(DecimalType.SYSTEM_DEFAULT)
>   val schema = StructType(
>     StructField("a", IntegerType, nullable = true) :: Nil)
>   val empData = Seq(Row(1))
>   val df = spark.createDataFrame(spark.sparkContext.parallelize(empData), schema)
>   val resultDF = df.select(Column(expression))
>   val result = resultDF.collect().head.get(0)
>   CatalystTypeConverters.convertToCatalyst(result)
> {code}
>  
> It seems that the failure occurs because the precision is not preserved when 
> Decimal.toJavaBigDecimal() is called: java.math.BigDecimal derives the 
> precision from the unscaled value, provides setScale() for changing the 
> scale, but offers no interface for setting the precision.
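> For illustration, the BigDecimal behavior described above can be reproduced 
> outside Spark. This is a standalone sketch (the class name is made up for 
> this example); the literal mirrors the zero value with scale 18 that the 
> repro produces:
>  
> {code:java}
> import java.math.BigDecimal;
>
> public class BigDecimalPrecisionDemo {
>     public static void main(String[] args) {
>         // A zero with 18 fractional digits, similar to what
>         // Decimal.toJavaBigDecimal() yields for the default
>         // DecimalType.SYSTEM_DEFAULT literal.
>         BigDecimal d = new BigDecimal("0.000000000000000000");
>         System.out.println(d.scale());     // 18
>         System.out.println(d.precision()); // 1 (the precision of zero is 1)
>         // BigDecimal has setScale(...), but no setPrecision(...):
>         // precision is always computed from the unscaled value, so
>         // scale > precision here, which DecimalType rejects.
>     }
> }
> {code}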
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
