[ https://issues.apache.org/jira/browse/SPARK-26964?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16774767#comment-16774767 ]
Hyukjin Kwon commented on SPARK-26964:
--------------------------------------

Can you describe the use case in more detail? Adding primitive types there requires a considerable amount of code to maintain. I want to see how much it's worth.

> to_json/from_json do not match JSON spec due to not supporting scalars
> ----------------------------------------------------------------------
>
>                 Key: SPARK-26964
>                 URL: https://issues.apache.org/jira/browse/SPARK-26964
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.3.2, 2.4.0
>            Reporter: Huon Wilson
>            Priority: Major
>
> Spark SQL's {{to_json}} and {{from_json}} currently support arrays and
> objects, but not the scalar/primitive types. This doesn't match the JSON spec
> on https://www.json.org/ or [RFC8259|https://tools.ietf.org/html/rfc8259]: a
> JSON document ({{json: element}}) consists of a value surrounded by
> whitespace ({{element: ws value ws}}), where a value is an object or array
> _or_ a number or string etc.:
> {code:none}
> value
>     object
>     array
>     string
>     number
>     "true"
>     "false"
>     "null"
> {code}
> Having {{to_json}} and {{from_json}} support scalars would make them flexible
> enough for a library I'm working on, where an arbitrary (user-supplied)
> column needs to be turned into JSON.
> NB. these newer specs differ from the original
> [RFC4627|https://tools.ietf.org/html/rfc4627] (which is now obsolete) that
> (essentially) had {{value: object | array}}.
> This is related to SPARK-24391 and SPARK-25252, which added support for
> arrays of scalars.
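
For context, a minimal Scala sketch of the behaviour described above, assuming Spark 2.3/2.4 with a local {{SparkSession}}; the column name, app name and sample data are illustrative, not taken from the report:

{code:scala}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, struct, to_json}

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("to_json-scalar-sketch")   // illustrative name
  .getOrCreate()
import spark.implicits._

val df = Seq(1, 2, 3).toDF("x")

// Works today: the scalar value must first be wrapped in a struct,
// which serialises as {"x":1} rather than the bare value 1.
df.select(to_json(struct(col("x")))).show(truncate = false)

// What the issue asks for: serialising the scalar column directly.
// In 2.3/2.4 this is rejected at analysis time, since to_json only
// accepts struct, map and array input.
// df.select(to_json(col("x"))).show()
{code}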