Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21671#discussion_r199342464
  
    --- Diff: python/pyspark/sql/functions.py ---
    @@ -2163,9 +2163,9 @@ def json_tuple(col, *fields):
     @since(2.1)
     def from_json(col, schema, options={}):
         """
    -    Parses a column containing a JSON string into a :class:`MapType` with :class:`StringType`
    -    as keys type, :class:`StructType` or :class:`ArrayType` of :class:`StructType`\\s with
    -    the specified schema. Returns `null`, in the case of an unparseable string.
    +    Parses a column containing a JSON string into a :class:`MapType`, :class:`StructType`
    +    or :class:`ArrayType` of :class:`StructType`\\s with the specified schema. Returns
    +    `null`, in the case of an unparseable string.
    --- End diff ---
    
    For date and timestamp, aren't they also supported in this way (converted to string) now? This patch adds date and timestamp support by allowing those types as the key type of a map type, so for other types I am thinking we could support them the same way, as key types of a map type.
    
    Previously this docstring made it clear that the key type of `MapType` is string only. Now it reads as if any data type can be the key type of `MapType`.
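
    For reference, a minimal PySpark sketch (not taken from this patch; the column name and sample data are illustrative) of calling `from_json` with a `MapType` schema. The string-keyed map below is the case the old docstring documented; whether other key types are accepted is exactly what this discussion is about:
    
    ```python
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import from_json
    from pyspark.sql.types import MapType, StringType
    
    spark = SparkSession.builder.getOrCreate()
    
    # One row holding a JSON object; its keys become the map keys.
    df = spark.createDataFrame([('{"a": "1", "b": "2"}',)], ["json"])
    
    # MapType with StringType keys -- the long-supported case.
    schema = MapType(StringType(), StringType())
    parsed = df.select(from_json("json", schema).alias("m"))
    
    # Expected: a single map column m with entries a -> 1, b -> 2.
    parsed.show(truncate=False)
    ```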

