[ https://issues.apache.org/jira/browse/SPARK-36503?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-36503:
------------------------------------

    Assignee:     (was: Apache Spark)

> Add RowToColumnConverter for BinaryType
> ---------------------------------------
>
>                 Key: SPARK-36503
>                 URL: https://issues.apache.org/jira/browse/SPARK-36503
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.3.0
>            Reporter: Huaxin Gao
>            Priority: Minor
>
> Currently, RowToColumnConverter supports all data types except BinaryType:
> {code:java}
>   private def getConverterForType(dataType: DataType, nullable: Boolean): TypeConverter = {
>     val core = dataType match {
>       case BooleanType => BooleanConverter
>       case ByteType => ByteConverter
>       case ShortType => ShortConverter
>       case IntegerType | DateType => IntConverter
>       case FloatType => FloatConverter
>       case LongType | TimestampType => LongConverter
>       case DoubleType => DoubleConverter
>       case StringType => StringConverter
>       case CalendarIntervalType => CalendarConverter
>       case at: ArrayType => ArrayConverter(getConverterForType(at.elementType, at.containsNull))
>       case st: StructType => new StructConverter(st.fields.map(
>         (f) => getConverterForType(f.dataType, f.nullable)))
>       case dt: DecimalType => new DecimalConverter(dt)
>       case mt: MapType => MapConverter(getConverterForType(mt.keyType, nullable = false),
>         getConverterForType(mt.valueType, mt.valueContainsNull))
>       case unknown => throw QueryExecutionErrors.unsupportedDataTypeError(unknown.toString)
>     }
> {code}
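> A converter for BinaryType would presumably mirror StringConverter, copying the raw byte array straight into the column vector. Below is a minimal sketch, assuming the existing TypeConverter signature (SpecializedGetters row in, WritableColumnVector out); the BinaryConverter object is illustrative, not a committed implementation:
> {code:java}
>   private object BinaryConverter extends TypeConverter {
>     override def append(row: SpecializedGetters, column: Int, cv: WritableColumnVector): Unit = {
>       // BinaryType values are exposed as raw byte arrays; append them to the vector's byte data.
>       val bytes = row.getBinary(column)
>       cv.appendByteArray(bytes, 0, bytes.length)
>     }
>   }
> {code}
> together with an extra arm in the match above:
> {code:java}
>       case BinaryType => BinaryConverter
> {code}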


