Hi,

I was wondering whether we have something that takes a Spark DataFrame
type as an argument, e.g. DecimalType(12,5), and converts it into the
corresponding Hive schema type (double / decimal / string)?
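For what it's worth, I believe pyspark's DataType already exposes simpleString() (catalogString on the Scala side), and for types like DecimalType(12,5) it yields a Hive-compatible DDL name such as "decimal(12,5)". If an explicit, pyspark-free mapping is needed, a rough sketch could look like the following — note the translation table below is my own assumption and covers only a few common cases:

```python
import re

# Assumed translation table for Spark type names whose spelling
# differs from the Hive DDL name; most names map through unchanged.
# This list is illustrative, not exhaustive.
_SPARK_TO_HIVE = {
    "long": "bigint",
    "integer": "int",
    "byte": "tinyint",
    "short": "smallint",
}

def spark_to_hive(spark_type: str) -> str:
    """Return a Hive DDL type name for a Spark SQL type string
    such as "decimal(12,5)", "double", or "string"."""
    # Split the base name from any parameters, e.g. "(12,5)".
    base = re.match(r"[a-z]+", spark_type.lower()).group(0)
    suffix = spark_type[len(base):]
    # Map the base name, keeping parameters like (precision,scale).
    return _SPARK_TO_HIVE.get(base, base) + suffix
```

So spark_to_hive("decimal(12,5)") would give back "decimal(12,5)", and "double" / "string" pass through unchanged.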

Any ideas?

Thanks,
