[ https://issues.apache.org/jira/browse/SPARK-41852?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sandeep Singh updated SPARK-41852:
----------------------------------
    Description: 
{code:java}
File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/functions.py", line 622, in pyspark.sql.connect.functions.pmod
Failed example:
    df.select(pmod("a", "b")).show()
Expected:
    +----------+
    |pmod(a, b)|
    +----------+
    |       NaN|
    |       NaN|
    |       1.0|
    |       NaN|
    |       1.0|
    |       2.0|
    |      -5.0|
    |       7.0|
    |       1.0|
    +----------+
Got:
    +----------+
    |pmod(a, b)|
    +----------+
    |      null|
    |      null|
    |       1.0|
    |      null|
    |       1.0|
    |       2.0|
    |      -5.0|
    |       7.0|
    |       1.0|
    +----------+
    <BLANKLINE>{code}

was:
{code:java}
File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/functions.py", line 313, in pyspark.sql.connect.functions.nanvl
Failed example:
    df.select(nanvl("a", "b").alias("r1"), nanvl(df.a, df.b).alias("r2")).collect()
Expected:
    [Row(r1=1.0, r2=1.0), Row(r1=2.0, r2=2.0)]
Got:
    [Row(r1=1.0, r2=1.0), Row(r1=nan, r2=nan)]{code}

> Fix `pmod` function
> -------------------
>
>                 Key: SPARK-41852
>                 URL: https://issues.apache.org/jira/browse/SPARK-41852
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Connect, PySpark
>    Affects Versions: 3.4.0
>            Reporter: Sandeep Singh
>            Priority: Major
>             Fix For: 3.4.0

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
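The doctest failure above comes down to `pmod`'s NaN semantics: the expected output propagates NaN for rows where an operand is NaN, while the Spark Connect build under test produced null instead. The sketch below is a hypothetical plain-Python model of the positive-modulus semantics, written only to match the expected doctest rows; it is not Spark's actual implementation.

```python
import math

def pmod(dividend: float, divisor: float) -> float:
    # Hypothetical model of Spark's pmod (positive modulus).
    # NaN in either operand (or a zero divisor) yields NaN, which is why the
    # expected doctest output shows NaN rather than null for those rows.
    if math.isnan(dividend) or math.isnan(divisor) or divisor == 0:
        return float("nan")
    r = math.fmod(dividend, divisor)
    # Shift the remainder so it takes the sign of the divisor; with a
    # negative divisor the result can be negative (cf. the -5.0 row above).
    if r != 0 and (r < 0) != (divisor < 0):
        r += divisor
    return r

print(pmod(7.0, 3.0))           # 1.0
print(pmod(-7.0, 3.0))          # 2.0
print(pmod(1.0, float("nan")))  # nan
```

Under this reading, the fix is in how the Connect client (or server) surfaces NaN results: they must round-trip as NaN, not be coerced to null.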