[ https://issues.apache.org/jira/browse/SPARK-37251?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Xinrong Meng resolved SPARK-37251.
----------------------------------
    Resolution: Invalid

> Failed _joinAsOf doctest
> ------------------------
>
>                 Key: SPARK-37251
>                 URL: https://issues.apache.org/jira/browse/SPARK-37251
>             Project: Spark
>          Issue Type: Sub-task
>          Components: PySpark
>    Affects Versions: 3.3.0
>            Reporter: Xinrong Meng
>            Priority: Major
>
> {code:java}
> File "/Users/xinrong.meng/spark/python/pyspark/sql/dataframe.py", line 1523, in pyspark.sql.dataframe.DataFrame._joinAsOf
> Failed example:
>     left._joinAsOf(
>         right, leftAsOfColumn="a", rightAsOfColumn="a", how="left", tolerance=F.lit(1)
>     ).select(left.a, 'left_val', 'right_val').sort("a").collect()
> Exception raised:
>     Traceback (most recent call last):
>       File "/opt/miniconda3/envs/py10/lib/python3.10/doctest.py", line 1348, in __run
>         exec(compile(example.source, filename, "single",
>       File "<doctest pyspark.sql.dataframe.DataFrame._joinAsOf[5]>", line 1, in <module>
>         left._joinAsOf(
>       File "/Users/xinrong.meng/spark/python/pyspark/sql/dataframe.py", line 1578, in _joinAsOf
>         jdf = self._jdf.joinAsOf(
>       File "/Users/xinrong.meng/spark/python/lib/py4j-0.10.9.2-src.zip/py4j/java_gateway.py", line 1309, in __call__
>         return_value = get_return_value(
>       File "/Users/xinrong.meng/spark/python/pyspark/sql/utils.py", line 178, in deco
>         return f(*a, **kw)
>       File "/Users/xinrong.meng/spark/python/lib/py4j-0.10.9.2-src.zip/py4j/protocol.py", line 330, in get_return_value
>         raise Py4JError(
>     py4j.protocol.Py4JError: An error occurred while calling o283.joinAsOf.
> Trace:
> py4j.Py4JException: Method joinAsOf([class org.apache.spark.sql.Dataset, class org.apache.spark.sql.Column, class org.apache.spark.sql.Column, null, class java.lang.String, class org.apache.spark.sql.Column, class java.lang.Boolean, class java.lang.String]) does not exist
> {code}

--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org