[ https://issues.apache.org/jira/browse/SPARK-41988?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon reassigned SPARK-41988:
------------------------------------

    Assignee: jiaan.geng

> Fix map_filter and map_zip_with output order
> --------------------------------------------
>
>                 Key: SPARK-41988
>                 URL: https://issues.apache.org/jira/browse/SPARK-41988
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Connect
>    Affects Versions: 3.4.0
>            Reporter: jiaan.geng
>            Assignee: jiaan.geng
>            Priority: Major
>
> {code:java}
> File "/Users/jiaan.geng/git-local/github-forks/spark/python/pyspark/sql/connect/functions.py", line 1423, in pyspark.sql.connect.functions.map_filter
> Failed example:
>     df.select(map_filter(
>         "data", lambda _, v: v > 30.0).alias("data_filtered")
>     ).show(truncate=False)
> Expected:
>     +--------------------------+
>     |data_filtered             |
>     +--------------------------+
>     |{baz -> 32.0, foo -> 42.0}|
>     +--------------------------+
> Got:
>     +--------------------------+
>     |data_filtered             |
>     +--------------------------+
>     |{foo -> 42.0, baz -> 32.0}|
>     +--------------------------+
> <BLANKLINE>
> **********************************************************************
> File "/Users/jiaan.geng/git-local/github-forks/spark/python/pyspark/sql/connect/functions.py", line 1465, in pyspark.sql.connect.functions.map_zip_with
> Failed example:
>     df.select(map_zip_with(
>         "base", "ratio", lambda k, v1, v2: round(v1 * v2, 2)).alias("updated_data")
>     ).show(truncate=False)
> Expected:
>     +---------------------------+
>     |updated_data               |
>     +---------------------------+
>     |{SALES -> 16.8, IT -> 48.0}|
>     +---------------------------+
> Got:
>     +---------------------------+
>     |updated_data               |
>     +---------------------------+
>     |{IT -> 48.0, SALES -> 16.8}|
>     +---------------------------+
> <BLANKLINE>
> {code}

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
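Both failures above come from doctests that assert on the order of map entries, which Spark does not guarantee for MapType values. A minimal pure-Python sketch of the problem (plain dicts stand in for MapType values, and the inputs below are illustrative, chosen only to reproduce the values shown in the failures):

```python
# Plain dicts stand in for Spark MapType values here; Spark does not
# guarantee map entry order, which is why the doctests above are brittle.

def map_filter(data, predicate):
    # Analog of pyspark's map_filter: keep entries whose (key, value) pass.
    return {k: v for k, v in data.items() if predicate(k, v)}

def map_zip_with(m1, m2, merge):
    # Analog of pyspark's map_zip_with: merge values of matching keys.
    return {k: merge(k, m1.get(k), m2.get(k)) for k in m1.keys() | m2.keys()}

# Illustrative inputs chosen to reproduce the values in the failures above.
data = {"foo": 42.0, "bar": 1.0, "baz": 32.0}
filtered = map_filter(data, lambda _, v: v > 30.0)

base = {"IT": 24.0, "SALES": 12.0}
ratio = {"IT": 2.0, "SALES": 1.4}
updated = map_zip_with(base, ratio, lambda k, v1, v2: round(v1 * v2, 2))

# Order-insensitive checks: dict equality ignores insertion order, and
# sorting the items pins a deterministic order for display.
assert filtered == {"baz": 32.0, "foo": 42.0}
assert sorted(updated.items()) == [("IT", 48.0), ("SALES", 16.8)]
```

The same idea applies to the failing doctests: either compare results order-insensitively, or sort the map entries deterministically before rendering the expected output.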