Hello everyone,

I am new to PySpark and I would like to ask if there is any way to have a
DataFrame column of ArrayType with a different DataType for each element
of the array. For example, to have something like:

StructType([StructField("Column_Name",
    ArrayType(ArrayType(FloatType(), FloatType(), DecimalType(), False), False),
    False)])

I want an ArrayType column with two elements as FloatType and one element
as DecimalType.
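
To make the intent concrete, here is a minimal sketch (the column and field
names, and the Decimal precision/scale, are just placeholders I made up).
The first schema is what I can write today, where every element of the array
shares one type; the second uses a StructType only to illustrate the mixed
layout I am after:

from pyspark.sql.types import (
    ArrayType, DecimalType, FloatType, StructField, StructType,
)

# What I can declare today: all array elements share a single type
uniform_schema = StructType([
    StructField("Column_Name", ArrayType(FloatType(), False), False)
])

# What I would like, shown with a StructType purely to illustrate the
# intended layout: two FloatType values followed by one DecimalType
# (precision and scale below are arbitrary)
intended_layout = StructType([
    StructField("float_1", FloatType(), False),
    StructField("float_2", FloatType(), False),
    StructField("decimal_1", DecimalType(10, 2), False),
])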

Thank you in advance
