Yikun Jiang created SPARK-39611:
-----------------------------------

             Summary: PySpark support numpy 1.23.X
                 Key: SPARK-39611
                 URL: https://issues.apache.org/jira/browse/SPARK-39611
             Project: Spark
          Issue Type: Sub-task
          Components: Build
    Affects Versions: 3.4.0
            Reporter: Yikun Jiang

{code:java}
starting mypy annotations test...
annotations failed mypy checks:
python/pyspark/pandas/frame.py:9970: error: Need type annotation for "raveled_column_labels"  [var-annotated]
Found 1 error in 1 file (checked 337 source files)
{code}
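The `[var-annotated]` error means mypy can no longer infer a type for the assignment target under the numpy 1.23 stubs. A minimal sketch of the usual fix, an explicit annotation on the variable (hypothetical names, not the actual frame.py patch):

```python
import numpy as np

# Hypothetical sketch (not the actual frame.py change): an explicit
# annotation on the assignment target satisfies mypy's [var-annotated]
# check when the type inferred from the numpy call is ambiguous.
labels = np.array([("a", 1), ("b", 2)], dtype=object)
raveled_column_labels: np.ndarray = labels.ravel()  # annotation added
print(raveled_column_labels.shape)  # flattened to one dimension
```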
{code:java}
======================================================================
ERROR [2.102s]: test_arithmetic_op_exceptions (pyspark.pandas.tests.test_series_datetime.SeriesDateTimeTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/pandas/tests/test_series_datetime.py", line 99, in test_arithmetic_op_exceptions
    self.assertRaisesRegex(TypeError, expected_err_msg, lambda: other / psser)
  File "/usr/lib/python3.9/unittest/case.py", line 1276, in assertRaisesRegex
    return context.handle('assertRaisesRegex', args, kwargs)
  File "/usr/lib/python3.9/unittest/case.py", line 201, in handle
    callable_obj(*args, **kwargs)
  File "/__w/spark/spark/python/pyspark/pandas/tests/test_series_datetime.py", line 99, in <lambda>
    self.assertRaisesRegex(TypeError, expected_err_msg, lambda: other / psser)
  File "/__w/spark/spark/python/pyspark/pandas/base.py", line 465, in __array_ufunc__
    raise NotImplementedError(
NotImplementedError: pandas-on-Spark objects currently do not support <ufunc 'divide'>.
----------------------------------------------------------------------
{code}
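The traceback shows NumPy dispatching `other / psser` to the pandas-on-Spark object's `__array_ufunc__`, which raises NotImplementedError instead of the TypeError the test expects. A self-contained sketch of that dispatch behavior (the class below is a stand-in, not the real pyspark.pandas mixin):

```python
import numpy as np


class PandasOnSparkLike:
    # Stand-in mimicking pyspark.pandas base.py: defining __array_ufunc__
    # makes NumPy route ufunc calls (including ndarray / obj) here, and
    # raising propagates the error to the caller.
    def __array_ufunc__(self, ufunc, method, *inputs, **kwargs):
        raise NotImplementedError(
            f"pandas-on-Spark objects currently do not support {ufunc}."
        )


obj = PandasOnSparkLike()
try:
    np.array([1.0]) / obj  # dispatched to obj.__array_ufunc__
except NotImplementedError as exc:
    print(type(exc).__name__)  # NotImplementedError, not TypeError
```

Under numpy 1.23 the test's `assertRaisesRegex(TypeError, ...)` therefore fails, since the division surfaces this NotImplementedError rather than a TypeError.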
--
This message was sent by Atlassian Jira
(v8.20.7#820007)
