Hi Experts,

I am trying to convert a string column holding decimal values to a decimal type in Spark SQL
and load the result into Hive / SQL Server.

In Hive, instead of being converted to decimal, all of my values come out as
null.

In SQL Server, the values do arrive, but without their decimal precision.

Can you please let me know if this is some kind of limitation?

Here is my code:


// select the required columns from the actual data frame
val query = """select eventId,
  cast(eventData.latitude as Decimal(10,10)) as Latitude,
  cast(eventData.longitude as Decimal(10,10)) as Longitude
  from event"""

// create the event data frame
val eventTableDF = sparkSession.sql(query)
// print the schema for debugging purposes
eventTableDF.printSchema()

root
 |-- eventId: string (nullable = true)
 |-- Latitude: decimal(10,10) (nullable = true)
 |-- Longitude: decimal(10,10) (nullable = true)
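
If it helps with debugging, the casted values could also be inspected on a
small sample before the write. This is just a sketch I would try, not output
from my actual run:

// peek at a few casted rows before writing (sketch only)
eventTableDF.select("Latitude", "Longitude").show(5, false)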



// append the casted data into the target table
eventTableDF.write.mode(org.apache.spark.sql.SaveMode.Append).insertInto(eventTable)
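
For completeness, here is a minimal standalone sketch that reproduces just the
cast outside of my pipeline. The sample values and the event_sample view name
are made up for illustration and do not come from my real data:

// standalone sketch: reproduce the string-to-decimal cast on made-up sample data
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("decimal-cast-check")
  .master("local[*]")
  .getOrCreate()

import spark.implicits._

// sample latitude/longitude kept as strings on purpose
val sample = Seq(("e1", "12.9715987", "77.5945627"))
  .toDF("eventId", "latitude", "longitude")
sample.createOrReplaceTempView("event_sample")

// same cast as in my real query, applied to the sample view
spark.sql("""select eventId,
  cast(latitude as Decimal(10,10)) as Latitude,
  cast(longitude as Decimal(10,10)) as Longitude
  from event_sample""").show(false)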





With Best Regards
Arnav Kumar
