Hello,

I am facing the below issue in my PySpark code:

We are running Spark code as a Dataproc Serverless batch on Google Cloud
Platform. The Spark job hits an issue while writing data to a BigQuery
table. In the BigQuery table, a few of the columns have the datatype
BIGNUMERIC, and the Spark code changes the datatype from BIGNUMERIC to
NUMERIC while writing the data. We need the datatype to stay BIGNUMERIC,
because we need values with precision and scale of (38,20), which NUMERIC
cannot hold.
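For reference, the write uses the spark-bigquery connector and looks roughly
like the sketch below (project, dataset, table, and bucket names are
placeholders, not our real ones):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bq-bignumeric-write").getOrCreate()

# Read the source data (placeholder table name).
df = spark.read.format("bigquery") \
    .option("table", "my_project.my_dataset.source_table") \
    .load()

# Write to the target table whose columns are declared BIGNUMERIC.
df.write.format("bigquery") \
    .option("table", "my_project.my_dataset.target_table") \
    .option("temporaryGcsBucket", "my-temp-bucket") \
    .mode("append") \
    .save()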


Can we cast a column to BIGNUMERIC in a Spark SQL DataFrame, the way the
below code does it for decimal?


df = spark.sql("""SELECT cast(col1 as decimal(38,20)) as col1 FROM table1""")
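For completeness, the equivalent cast through the DataFrame API would be the
sketch below (col1 is a placeholder column name; as far as I know Spark's
DecimalType only goes up to precision 38, so I am not sure a BIGNUMERIC cast
is even possible on the Spark side, or whether the connector decides the
BigQuery type on its own):

from pyspark.sql import functions as F

# Cast to the widest decimal Spark supports before writing; unclear whether
# the connector will keep the target column as BIGNUMERIC.
df = df.withColumn("col1", F.col("col1").cast("decimal(38,20)"))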

Spark version: 3.3

PySpark version: 1.1


Regards,

Nidhi
