Sandeep Singh created SPARK-41898:
-------------------------------------

             Summary: Window.rowsBetween should handle `float("-inf")` and `float("+inf")` as argument
                 Key: SPARK-41898
                 URL: https://issues.apache.org/jira/browse/SPARK-41898
             Project: Spark
          Issue Type: Sub-task
          Components: Connect
    Affects Versions: 3.4.0
            Reporter: Sandeep Singh


For the failing call, PySpark throws Py4JJavaError, whereas Spark Connect throws SparkConnectException:
{code:python}
from py4j.protocol import Py4JJavaError

from pyspark.sql import Row
from pyspark.sql.functions import assert_true

df = self.spark.range(3)

# Every id in [0, 3) satisfies id < 3, so assert_true yields NULL per row.
self.assertEqual(
    df.select(assert_true(df.id < 3)).toDF("val").collect(),
    [Row(val=None), Row(val=None), Row(val=None)],
)

# id == 2 violates id < 2; classic PySpark surfaces this as Py4JJavaError.
with self.assertRaises(Py4JJavaError) as cm:
    df.select(assert_true(df.id < 2, "too big")).toDF("val").collect()
{code}
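Until both backends raise a common exception type, a test meant to run against either backend could hedge by accepting both classes. This is only an illustrative workaround, not a proposed fix; the `SparkConnectException` import path is taken from the traceback below:
{code:python}
from py4j.protocol import Py4JJavaError

from pyspark.sql.connect.client import SparkConnectException

# assertRaises accepts a tuple of exception classes, so the same test
# passes whether the classic or the Connect backend raises the error.
with self.assertRaises((Py4JJavaError, SparkConnectException)):
    df.select(assert_true(df.id < 2, "too big")).toDF("val").collect()
{code}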
{code}
Traceback (most recent call last):
  File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/tests/test_functions.py", line 950, in test_assert_true
    df.select(assert_true(df.id < 2, "too big")).toDF("val").collect()
  File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/dataframe.py", line 1076, in collect
    table = self._session.client.to_table(query)
  File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/client.py", line 414, in to_table
    table, _ = self._execute_and_fetch(req)
  File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/client.py", line 586, in _execute_and_fetch
    self._handle_error(rpc_error)
  File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/client.py", line 629, in _handle_error
    raise SparkConnectException(status.message, info.reason) from None
pyspark.sql.connect.client.SparkConnectException: (java.lang.RuntimeException) too big
{code}
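On the `Window.rowsBetween` side named in the summary, classic PySpark already accepts `float("-inf")` and `float("+inf")` by clamping them to the unbounded frame boundaries before building the window spec. A minimal sketch of the same clamping, assuming the Connect client mirrors classic PySpark's threshold constants (the actual Connect plan construction is elided):
{code:python}
import sys

# Sketch only: threshold names mirror classic pyspark.sql.window.Window;
# the Connect WindowSpec construction itself is omitted here.
_JAVA_MIN_LONG = -(1 << 63)          # Window.unboundedPreceding
_JAVA_MAX_LONG = (1 << 63) - 1       # Window.unboundedFollowing
_PRECEDING_THRESHOLD = max(-sys.maxsize, _JAVA_MIN_LONG)
_FOLLOWING_THRESHOLD = min(sys.maxsize, _JAVA_MAX_LONG)

def rows_between(start, end):
    # float("-inf") compares <= any finite threshold, so it is clamped
    # to the unbounded-preceding marker instead of failing on the server.
    if start <= _PRECEDING_THRESHOLD:
        start = _JAVA_MIN_LONG
    if end >= _FOLLOWING_THRESHOLD:
        end = _JAVA_MAX_LONG
    # ... build and return the Connect WindowSpec from (start, end) ...
    return int(start), int(end)

assert rows_between(float("-inf"), float("+inf")) == (_JAVA_MIN_LONG, _JAVA_MAX_LONG)
{code}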


