JustFeng opened a new pull request, #6985:
URL: https://github.com/apache/kyuubi/pull/6985

   [[KYUUBI #6984] Fix ValueError when rendering MapType data](https://github.com/apache/kyuubi/issues/6984)
   
   ### Why are the changes needed?
   The issue was caused by incorrect iteration over `MapType` data in the `%table` magic command. When rendering a `MapType` column, the code iterated with `for k, v in m` directly; since iterating a Python dict yields only its keys, each key is then unpacked as if it were a key/value pair, which raises a `ValueError`. A minimal illustration of the failure, and of the explicit entry iteration that avoids it, is sketched below.
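
   For reference, a minimal self-contained sketch of the problem (the dict `m` here is illustrative only, not the actual rendering code):
   ```python
   m = {"a": "1", "b": "2"}

   # Buggy pattern: iterating a dict yields its keys, so each key (a string)
   # is unpacked into (k, v), which fails unless the key happens to have
   # exactly two characters.
   try:
       for k, v in m:
           pass
   except ValueError as e:
       print(e)  # e.g. "not enough values to unpack (expected 2, got 1)"

   # Corrected pattern: iterate over key/value pairs explicitly.
   for k, v in m.items():
       print(k, v)
   ```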
   
   ### How was this patch tested?
   - [x] Manual testing:  
     Executed a query with a `MapType` column and confirmed that the `%table` command now renders it without errors.
   ```python
   from pyspark.sql import SparkSession

   spark = SparkSession.builder \
       .appName("MapFieldExample") \
       .getOrCreate()

   # Rows with a MAP<STRING, STRING> column
   data = [
       (1, {"a": "1", "b": "2"}),
       (2, {"x": "10"}),
       (3, {"key": "value"})
   ]

   schema = "id INT, map_col MAP<STRING, STRING>"
   df = spark.createDataFrame(data, schema=schema)
   df.printSchema()
   df2 = df.collect()
   ```
   Then render the collected rows with the `%table` magic:
   ```python
    %table df2
   ```
   
   Result:
   ```python
   {'application/vnd.livy.table.v1+json': {'headers': [{'name': 'id', 'type': 'INT_TYPE'}, {'name': 'map_col', 'type': 'MAP_TYPE'}], 'data': [[1, {'a': '1', 'b': '2'}], [2, {'x': '10'}], [3, {'key': 'value'}]]}}
   ```
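
   For reference, a minimal sketch of how rows with a map column can be packed into such an `application/vnd.livy.table.v1+json` bundle. This is illustrative only, not the actual Kyuubi/Livy renderer; the helper name `to_livy_table` and the hard-coded header list are assumptions, and `df2` is the collected result from the snippet above.
   ```python
   # Hypothetical helper, for illustration only.
   def to_livy_table(rows, headers):
       data = []
       for row in rows:
           cells = []
           for value in row:
               # Map columns come back from collect() as Python dicts; keep them
               # as dicts rather than unpacking with "for k, v in value", which
               # is exactly the pattern that raised the ValueError.
               cells.append(dict(value) if isinstance(value, dict) else value)
           data.append(cells)
       return {'application/vnd.livy.table.v1+json':
               {'headers': headers, 'data': data}}

   headers = [{'name': 'id', 'type': 'INT_TYPE'},
              {'name': 'map_col', 'type': 'MAP_TYPE'}]
   print(to_livy_table(df2, headers))
   ```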
   
   
   ### Was this patch authored or co-authored using generative AI tooling?
   No
   
   **Notice:** This PR was co-authored by DeepSeek-R1.
   
   
   
   

