[ https://issues.apache.org/jira/browse/SPARK-31695?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Saravanan Raju updated SPARK-31695:
-----------------------------------
    Description: 
I was trying to convert a JSON column to a map using a UDF, but it is not working as expected.
  
{code:java}
// spark-shell session; parse comes from json4s, which is bundled with Spark
import org.apache.spark.sql.functions.udf
import org.json4s.jackson.JsonMethods.parse

val df1 = Seq("{\"k\":10.004}").toDF("json")

// Parse the JSON object and force every value to a BigDecimal with scale 6.
def udfJsonStrToMapDecimal = udf((jsonStr: String) => {
  val jsonMap = parse(jsonStr).values.asInstanceOf[Map[String, Any]]
  jsonMap.map { case (k, v) => (k, BigDecimal.decimal(v.asInstanceOf[Double]).setScale(6)) }
})

val f = df1.withColumn("map", udfJsonStrToMapDecimal($"json"))

scala> f.printSchema
root
 |-- json: string (nullable = true)
 |-- map: map (nullable = true)
 |    |-- key: string
 |    |-- value: decimal(38,18) (valueContainsNull = true)
{code}
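
This looks like a consequence of how Spark infers the UDF's return schema: the decimal's precision and scale come from the static Scala type (scala.math.BigDecimal), for which Spark falls back to its system default decimal, not from the setScale call on the runtime values. A small check of that assumption:

{code:java}
import org.apache.spark.sql.types.DecimalType

// The default DecimalType Spark uses when precision/scale are not known
// statically, e.g. for a scala.math.BigDecimal returned by a UDF.
println(DecimalType.SYSTEM_DEFAULT.simpleString)  // decimal(38,18)
{code}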
 

*Instead of decimal(38,6), it converts the value to decimal(38,18).*
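
If the goal is to actually get decimal(38,6) in the schema, one possible workaround (a sketch only, not verified on 2.3.4) is to declare the return type explicitly via the udf overload that takes a DataType, instead of relying on type inference:

{code:java}
import org.apache.spark.sql.functions.udf
import org.apache.spark.sql.types.{DecimalType, MapType, StringType}
import org.json4s.jackson.JsonMethods.parse

// Untyped udf variant with an explicit return DataType, so the map's value
// type should be decimal(38,6) instead of the inferred decimal(38,18).
val udfJsonStrToMapDecimal6 = udf(
  (jsonStr: String) => {
    val jsonMap = parse(jsonStr).values.asInstanceOf[Map[String, Any]]
    jsonMap.map { case (k, v) => (k, BigDecimal.decimal(v.asInstanceOf[Double]).setScale(6)) }
  },
  MapType(StringType, DecimalType(38, 6))
)

val g = df1.withColumn("map", udfJsonStrToMapDecimal6($"json"))
// expected: |-- value: decimal(38,6) (valueContainsNull = true)
{code}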


> BigDecimal setScale is not working in Spark UDF
> -----------------------------------------------
>
>                 Key: SPARK-31695
>                 URL: https://issues.apache.org/jira/browse/SPARK-31695
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, SQL
>    Affects Versions: 2.3.4
>            Reporter: Saravanan Raju
>            Priority: Major
>


