[ 
https://issues.apache.org/jira/browse/SPARK-50610?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17906659#comment-17906659
 ] 

zuotingbing edited comment on SPARK-50610 at 12/19/24 12:49 AM:
----------------------------------------------------------------

!image-2024-12-19-08-45-13-688.png!

 

The precision of the decimal column defined in my table is decimal(18,10), but 
in javaHiveDecimalObjectInspector the decimal precision is (38,18). Here we 
lose the decimal precision and scale, right?
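To illustrate the mismatch, a minimal snippet (hypothetical reproduction; it assumes the Hive serde2 classes on the classpath) shows that the shared default inspector carries Hive's system-default decimal type info rather than the column's declared (18,10):

```scala
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory

// The shared default java HiveDecimal inspector is built from Hive's
// system-default decimal type info, not from any particular column,
// so its reported type is decimal(38,18) regardless of decimal(18,10).
val oi = PrimitiveObjectInspectorFactory.javaHiveDecimalObjectInspector
println(oi.getTypeName) // expected to report decimal(38,18)
```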

 


was (Author: zuo.tingbing9):
!image-2024-12-18-17-01-08-790.png!

 

Spark loses the decimal precision and scale, right?

 

> we should fix the decimal precision on function toInspector(dataType: 
> DataType) in HiveInspectors class
> -------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-50610
>                 URL: https://issues.apache.org/jira/browse/SPARK-50610
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.4.4, 3.5.3
>            Reporter: zuotingbing
>            Priority: Major
>         Attachments: image-2024-12-18-17-01-08-790.png, 
> image-2024-12-19-08-45-13-688.png
>
>
> when we add a mask UDF on a Decimal(18,10) column in the hive-sdf module, 
> the scale of the masked result value should be 10, but it is currently 18.
> In the JavaHiveDecimalObjectInspector class the default scale of HiveDecimal 
> is 18, so we need to use the correct decimal precision in the function 
> toInspector(dataType: DataType) in the HiveInspectors class.
>  
> Can somebody fix it?
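A minimal sketch of one possible fix (the helper name `toDecimalInspector` and the exact wiring into `HiveInspectors.toInspector` are assumptions, not the actual patch; it requires the Hive serde2 classes on the classpath): instead of returning the shared default inspector, build one from the column's declared DecimalType.

```scala
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory
import org.apache.hadoop.hive.serde2.typeinfo.DecimalTypeInfo
import org.apache.spark.sql.types.DecimalType

// Build an inspector that preserves the column's declared precision and
// scale instead of Hive's system default decimal(38,18).
def toDecimalInspector(dt: DecimalType) =
  PrimitiveObjectInspectorFactory.getPrimitiveJavaObjectInspector(
    new DecimalTypeInfo(dt.precision, dt.scale))
```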



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
