[ https://issues.apache.org/jira/browse/SPARK-41757?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon reassigned SPARK-41757:
------------------------------------

    Assignee: Hyukjin Kwon

> Compatibility of string representation in Column
> ------------------------------------------------
>
>                 Key: SPARK-41757
>                 URL: https://issues.apache.org/jira/browse/SPARK-41757
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Connect
>    Affects Versions: 3.4.0
>            Reporter: Sandeep Singh
>            Assignee: Hyukjin Kwon
>            Priority: Major
>
> The doctest in pyspark.sql.connect.column.Column fails with the error below:
> {code:java}
> File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/column.py", line 120, in pyspark.sql.connect.column.Column
> Failed example:
>     df.name
> Expected:
>     Column<'name'>
> Got:
>     Column<'ColumnReference(name)'>
> **********************************************************************
> File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/column.py", line 122, in pyspark.sql.connect.column.Column
> Failed example:
>     df["name"]
> Expected:
>     Column<'name'>
> Got:
>     Column<'ColumnReference(name)'>
> **********************************************************************
> File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/column.py", line 127, in pyspark.sql.connect.column.Column
> Failed example:
>     df.age + 1
> Expected:
>     Column<'(age + 1)'>
> Got:
>     Column<'+(ColumnReference(age), Literal(1))'>
> **********************************************************************
> File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/column.py", line 129, in pyspark.sql.connect.column.Column
> Failed example:
>     1 / df.age
> Expected:
>     Column<'(1 / age)'>
> Got:
>     Column<'/(Literal(1), ColumnReference(age))'>
> {code}
>
> We should re-enable this doctest after fixing the issue in Spark Connect.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
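
The mismatch above comes from the Spark Connect Column repr rendering the internal expression tree ("+(ColumnReference(age), Literal(1))") instead of the classic PySpark SQL-style string ("(age + 1)"). A minimal, hypothetical sketch of the difference is below; the class names mirror those in the error output, but this is illustrative only, not the actual Spark Connect implementation:

```python
# Hypothetical sketch: rendering a Column repr in the classic PySpark
# style ("Column<'(age + 1)'>") rather than the raw expression-tree
# form ("Column<'+(ColumnReference(age), Literal(1))'>").
# Class names echo the Jira error output but are illustrative only.

class Expr:
    def sql(self) -> str:
        raise NotImplementedError


class ColumnReference(Expr):
    def __init__(self, name: str):
        self.name = name

    def sql(self) -> str:
        # Render as the bare column name, not "ColumnReference(name)".
        return self.name


class Literal(Expr):
    def __init__(self, value):
        self.value = value

    def sql(self) -> str:
        return str(self.value)


class BinaryOp(Expr):
    def __init__(self, op: str, left: Expr, right: Expr):
        self.op, self.left, self.right = op, left, right

    def sql(self) -> str:
        # Infix with parentheses, matching the doctest's expected output.
        return f"({self.left.sql()} {self.op} {self.right.sql()})"


class Column:
    def __init__(self, expr: Expr):
        self._expr = expr

    def __repr__(self) -> str:
        return f"Column<'{self._expr.sql()}'>"


age_plus_one = Column(BinaryOp("+", ColumnReference("age"), Literal(1)))
print(repr(age_plus_one))  # Column<'(age + 1)'>
```

Rendering each expression node as its SQL-like string, rather than its constructor form, is one way the Connect repr could match the classic PySpark doctest output.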