Sandeep Singh created SPARK-41838:
-------------------------------------

             Summary: DataFrame.show() fix map printing
                 Key: SPARK-41838
                 URL: https://issues.apache.org/jira/browse/SPARK-41838
             Project: Spark
          Issue Type: Sub-task
          Components: Connect
    Affects Versions: 3.4.0
            Reporter: Sandeep Singh
{code:java}
File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/functions.py", line 1594, in pyspark.sql.connect.functions.to_json
Failed example:
    df = spark.createDataFrame(data, ("key", "value"))
Exception raised:
    Traceback (most recent call last):
      File "/usr/local/Cellar/python@3.10/3.10.8/Frameworks/Python.framework/Versions/3.10/lib/python3.10/doctest.py", line 1350, in __run
        exec(compile(example.source, filename, "single",
      File "<doctest pyspark.sql.connect.functions.to_json[3]>", line 1, in <module>
        df = spark.createDataFrame(data, ("key", "value"))
      File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/session.py", line 252, in createDataFrame
        table = pa.Table.from_pandas(pdf)
      File "pyarrow/table.pxi", line 3475, in pyarrow.lib.Table.from_pandas
      File "/usr/local/lib/python3.10/site-packages/pyarrow/pandas_compat.py", line 611, in dataframe_to_arrays
        arrays = [convert_column(c, f)
      File "/usr/local/lib/python3.10/site-packages/pyarrow/pandas_compat.py", line 611, in <listcomp>
        arrays = [convert_column(c, f)
      File "/usr/local/lib/python3.10/site-packages/pyarrow/pandas_compat.py", line 598, in convert_column
        raise e
      File "/usr/local/lib/python3.10/site-packages/pyarrow/pandas_compat.py", line 592, in convert_column
        result = pa.array(col, type=type_, from_pandas=True, safe=safe)
      File "pyarrow/array.pxi", line 316, in pyarrow.lib.array
      File "pyarrow/array.pxi", line 83, in pyarrow.lib._ndarray_to_array
      File "pyarrow/error.pxi", line 100, in pyarrow.lib.check_status
    pyarrow.lib.ArrowInvalid: ("Could not convert 'Alice' with type str: tried to convert to int64", 'Conversion failed for column 1 with type object')
{code}
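The ArrowInvalid at the bottom of the trace can be reproduced with pyarrow alone, without Spark Connect in the loop: when a pandas object column is converted with an explicit int64 target type, a string value makes the conversion fail. A minimal sketch (only the string value 'Alice' and the int64 target type come from the error message above; the call shape is an assumption for illustration):

{code:java}
import pyarrow as pa

# Converting a string value while requesting an int64 Arrow type fails,
# matching the "Could not convert 'Alice' with type str" message in the trace.
try:
    pa.array(["Alice"], type=pa.int64(), from_pandas=True)
except pa.ArrowInvalid as e:
    print(type(e).__name__)  # ArrowInvalid
{code}

This suggests the doctest's tuple data is being mapped to the wrong column types before reaching Arrow, rather than a problem inside pyarrow itself.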