In addition to the below:
Can we cast a column to BIGNUMERIC in a Spark SQL DataFrame, the same way the
code below casts to decimal?

df = spark.sql("""SELECT CAST(col1 AS DECIMAL(38,20)) AS col1 FROM table1""")

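For context, Spark SQL itself has no BIGNUMERIC type; DecimalType tops out at
precision 38, so DECIMAL(38,20) is the widest 20-scale decimal the cast can
target. A minimal sketch of the same cast through the DataFrame API (table1 and
col1 taken from the example above, assumed to exist as a registered table):

from pyspark.sql.functions import col
from pyspark.sql.types import DecimalType

# Equivalent cast through the DataFrame API; behavior matches the SQL cast above.
df = spark.table("table1").withColumn("col1", col("col1").cast(DecimalType(38, 20)))
df.printSchema()  # col1: decimal(38,20)
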
Regards,
Nidhi

On Wed, 22 Feb 2023 at 9:21 PM, nidhi kher <kherni...@gmail.com> wrote:

> Hello,
>
> I am facing the below issue in Spark code:
> We are running Spark code as a Dataproc Serverless batch on Google Cloud
> Platform. The Spark code hits an issue while writing data to a BigQuery
> table. In the BigQuery table, a few of the columns have the datatype
> BIGNUMERIC, and the Spark code changes the datatype from BIGNUMERIC to
> NUMERIC while writing the data. We need the datatype to stay BIGNUMERIC,
> as we need data at (38,20) precision.
>
> Please suggest.
>
> Regards,
> Nidhi Kher
>

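For reference, a minimal sketch of the BigQuery write path described in the
quoted message above, assuming the spark-bigquery connector. The project,
dataset, table, and bucket names are placeholders, and whether a DECIMAL(38,20)
column lands as NUMERIC or BIGNUMERIC depends on the connector version, so
please verify against the connector documentation:

# Write the casted DataFrame to BigQuery via the spark-bigquery connector.
# "my-project.my_dataset.table1" and "my-staging-bucket" are placeholders.
(df.write
   .format("bigquery")
   .option("table", "my-project.my_dataset.table1")
   .option("temporaryGcsBucket", "my-staging-bucket")
   .mode("append")
   .save())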