Yikun Jiang created SPARK-36031:
-----------------------------------

             Summary: Keep same behavior with pandas for operations of series with nan
                 Key: SPARK-36031
                 URL: https://issues.apache.org/jira/browse/SPARK-36031
             Project: Spark
          Issue Type: Sub-task
          Components: PySpark
    Affects Versions: 3.2.0, 3.3.0
            Reporter: Yikun Jiang
There are many series operations that don't follow pandas behavior, for example:

>>> pser = pd.Series([1, 2, np.nan], dtype=float)
>>> psser = ps.from_pandas(pser)
>>> pser.astype(int)
Traceback (most recent call last):
  File "/Users/jiangyikun/venv36/lib/python3.6/site-packages/IPython/core/interactiveshell.py", line 3343, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-30-1ca2ff8756d2>", line 1, in <module>
    pser.astype(int)
  File "/Users/jiangyikun/venv36/lib/python3.6/site-packages/pandas/core/generic.py", line 5548, in astype
    new_data = self._mgr.astype(dtype=dtype, copy=copy, errors=errors,)
  File "/Users/jiangyikun/venv36/lib/python3.6/site-packages/pandas/core/internals/managers.py", line 604, in astype
    return self.apply("astype", dtype=dtype, copy=copy, errors=errors)
  File "/Users/jiangyikun/venv36/lib/python3.6/site-packages/pandas/core/internals/managers.py", line 409, in apply
    applied = getattr(b, f)(**kwargs)
  File "/Users/jiangyikun/venv36/lib/python3.6/site-packages/pandas/core/internals/blocks.py", line 595, in astype
    values = astype_nansafe(vals1d, dtype, copy=True)
  File "/Users/jiangyikun/venv36/lib/python3.6/site-packages/pandas/core/dtypes/cast.py", line 968, in astype_nansafe
    raise ValueError("Cannot convert non-finite values (NA or inf) to integer")
ValueError: Cannot convert non-finite values (NA or inf) to integer

>>> psser.astype(int)
0    1.0
1    2.0
2    NaN
dtype: float64

>>> pser = pd.Series([1, 2, np.nan], dtype=float)
>>> psser = ps.from_pandas(pser)
>>> pser ** False
0    1.0
1    1.0
2    1.0
dtype: float64

>>> psser ** False
0    1.0
1    1.0
2    NaN
dtype: float64
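For reference, a minimal reproduction sketch (not part of the ticket) that runs the two operations above against both plain pandas and pandas-on-Spark and reports whether they behave the same way; the helper name compare_behavior is illustrative only, and it assumes pyspark with the pandas API on Spark is installed:

    import numpy as np
    import pandas as pd
    import pyspark.pandas as ps


    def compare_behavior(op_name, op):
        # Build the same series in pandas and pandas-on-Spark.
        pser = pd.Series([1, 2, np.nan], dtype=float)
        psser = ps.from_pandas(pser)

        def run(series):
            # Return either the result or the ValueError message,
            # so the two backends can be compared side by side.
            try:
                return ("result", op(series))
            except ValueError as e:
                return ("ValueError", str(e))

        pd_kind, _ = run(pser)
        ps_kind, _ = run(psser)
        print(f"{op_name}: pandas -> {pd_kind}, pandas-on-Spark -> {ps_kind}")


    if __name__ == "__main__":
        # pandas raises "Cannot convert non-finite values (NA or inf) to integer",
        # while pandas-on-Spark silently returns the original float series.
        compare_behavior("astype(int)", lambda s: s.astype(int))
        # pandas returns 1.0 for every element (including NaN),
        # while pandas-on-Spark keeps NaN.
        compare_behavior("** False", lambda s: s ** False)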