Yang Jie created SPARK-46654:
--------------------------------

             Summary: df.show() of pyspark displayed different results between Regular Spark and Spark Connect
                 Key: SPARK-46654
                 URL: https://issues.apache.org/jira/browse/SPARK-46654
             Project: Spark
          Issue Type: Bug
          Components: Connect, PySpark
    Affects Versions: 4.0.0
            Reporter: Yang Jie
The following doctest throws an error in the tests of the pyspark-connect module:

{code:java}
Example 2: Converting a complex StructType to a CSV string

>>> from pyspark.sql import Row, functions as sf
>>> data = [(1, Row(age=2, name='Alice', scores=[100, 200, 300]))]
>>> df = spark.createDataFrame(data, ("key", "value"))
>>> df.select(sf.to_csv(df.value)).show(truncate=False)  # doctest: +SKIP
+-----------------------+
|to_csv(value)          |
+-----------------------+
|2,Alice,"[100,200,300]"|
+-----------------------+
{code}

{code:java}
**********************************************************************
File "/__w/spark/spark/python/pyspark/sql/connect/functions/builtin.py", line 2232, in pyspark.sql.connect.functions.builtin.to_csv
Failed example:
    df.select(sf.to_csv(df.value)).show(truncate=False)
Expected:
    +-----------------------+
    |to_csv(value)          |
    +-----------------------+
    |2,Alice,"[100,200,300]"|
    +-----------------------+
Got:
    +--------------------------------------------------------------------------+
    |to_csv(value)                                                             |
    +--------------------------------------------------------------------------+
    |2,Alice,org.apache.spark.sql.catalyst.expressions.UnsafeArrayData@99c5e30f|
    +--------------------------------------------------------------------------+
    <BLANKLINE>
**********************************************************************
1 of 18 in pyspark.sql.connect.functions.builtin.to_csv
***Test Failed*** 1 failures.
{code}
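For reference, a minimal standalone reproduction sketch outside of the doctest; the session setup and the sc://localhost:15002 endpoint are assumptions, not part of the report:

{code:python}
# Reproduction sketch: run the same query through a Spark Connect session.
# Assumes a Spark Connect server is already listening at sc://localhost:15002.
from pyspark.sql import SparkSession, Row, functions as sf

spark = SparkSession.builder.remote("sc://localhost:15002").getOrCreate()

data = [(1, Row(age=2, name='Alice', scores=[100, 200, 300]))]
df = spark.createDataFrame(data, ("key", "value"))

# Regular Spark prints 2,Alice,"[100,200,300]"; through Spark Connect the nested
# array column is rendered as an UnsafeArrayData object reference instead.
df.select(sf.to_csv(df.value)).show(truncate=False)
{code}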