[ https://issues.apache.org/jira/browse/SPARK-31937?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17163160#comment-17163160 ]
angerszhu commented on SPARK-31937:
-----------------------------------

Will raise a PR soon.

{code:java}
-- SPARK-31937: transform with defined row format delimiters
--SELECT TRANSFORM(a, b, c, d, e, null)
--ROW FORMAT DELIMITED
--FIELDS TERMINATED BY '|'
--COLLECTION ITEMS TERMINATED BY '&'
--MAP KEYS TERMINATED BY '*'
--LINES TERMINATED BY '\n'
--NULL DEFINED AS 'NULL'
--USING 'cat' AS (a, b, c, d, e, f)
--ROW FORMAT DELIMITED
--FIELDS TERMINATED BY '|'
--COLLECTION ITEMS TERMINATED BY '&'
--MAP KEYS TERMINATED BY '*'
--LINES TERMINATED BY '\n'
--NULL DEFINED AS 'NULL'
--FROM VALUES (1, 1.23, array(1, 2, 3), map(1, '1'), struct(1, '1')) t(a, b, c, d, e);
--
--SELECT TRANSFORM(a, b, c, d, e, null)
--ROW FORMAT DELIMITED
--FIELDS TERMINATED BY '|'
--COLLECTION ITEMS TERMINATED BY '&'
--MAP KEYS TERMINATED BY '*'
--LINES TERMINATED BY '\n'
--NULL DEFINED AS 'NULL'
--USING 'cat' AS (a)
--ROW FORMAT DELIMITED
--FIELDS TERMINATED BY '||'
--LINES TERMINATED BY '\n'
--NULL DEFINED AS 'NULL'
--FROM VALUES (1, 1.23, array(1, 2, 3), map(1, '1'), struct(1, '1')) t(a, b, c, d, e);
{code}

> Support processing array and map type using spark noserde mode
> --------------------------------------------------------------
>
>                 Key: SPARK-31937
>                 URL: https://issues.apache.org/jira/browse/SPARK-31937
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: angerszhu
>            Priority: Major
>
> Currently, it is not supported to use a script (e.g.
> Python) to process array type or map type; it fails with messages like:
>
> {{org.apache.spark.sql.catalyst.expressions.UnsafeArrayData cannot be cast to [Ljava.lang.Object}}
> {{org.apache.spark.sql.catalyst.expressions.UnsafeMapData cannot be cast to java.util.Map}}
>
> This issue proposes to support it.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
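For context, the ROW FORMAT DELIMITED clauses in the queries above describe how each row is flattened to text before being piped to the script in no-serde mode. Below is a minimal illustrative sketch in plain Python (not Spark's actual implementation) of that serialization, assuming Hive's standard nested-delimiter convention: fields joined by '|', array/struct items by '&', map key/value pairs by '*', and NULL rendered as the literal string 'NULL'. The helper names `serialize_value` and `serialize_row` are hypothetical.

```python
# Hedged sketch of Hive-style delimited serialization for script transform.
# Delimiters mirror the ROW FORMAT DELIMITED clauses in the queries above.
FIELD_DELIM = "|"        # FIELDS TERMINATED BY '|'
COLLECTION_DELIM = "&"   # COLLECTION ITEMS TERMINATED BY '&'
MAPKEY_DELIM = "*"       # MAP KEYS TERMINATED BY '*'
NULL_STR = "NULL"        # NULL DEFINED AS 'NULL'

def serialize_value(v):
    """Serialize one column value using nested delimiters (assumption:
    structs reuse the collection delimiter, as in Hive's LazySimpleSerDe)."""
    if v is None:
        return NULL_STR
    if isinstance(v, (list, tuple)):  # array or struct
        return COLLECTION_DELIM.join(serialize_value(x) for x in v)
    if isinstance(v, dict):           # map: entries by '&', key/value by '*'
        return COLLECTION_DELIM.join(
            f"{serialize_value(k)}{MAPKEY_DELIM}{serialize_value(val)}"
            for k, val in v.items())
    return str(v)

def serialize_row(row):
    """Join all serialized columns with the field delimiter."""
    return FIELD_DELIM.join(serialize_value(v) for v in row)

# Mirrors FROM VALUES (1, 1.23, array(1, 2, 3), map(1, '1'), struct(1, '1'))
# with a trailing null column, as in the TRANSFORM(..., null) select list.
row = (1, 1.23, [1, 2, 3], {1: "1"}, (1, "1"), None)
print(serialize_row(row))  # 1|1.23|1&2&3|1*1|1&1|NULL
```

This is the text format the child process (here, `cat`) would read on stdin, one such line per input row.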