viirya commented on a change in pull request #22807: [SPARK-25811][PySpark] Raise a proper error when unsafe cast is detected by PyArrow
URL: https://github.com/apache/spark/pull/22807#discussion_r246630915
##########
File path: python/pyspark/sql/tests.py
##########

@@ -4961,6 +4961,31 @@ def foofoo(x, y):
                 ).collect
             )

+    def test_pandas_udf_detect_unsafe_type_conversion(self):
+        from distutils.version import LooseVersion
+        from pyspark.sql.functions import pandas_udf
+        import pandas as pd
+        import numpy as np
+        import pyarrow as pa
+
+        values = [1.0] * 3
+        pdf = pd.DataFrame({'A': values})
+        df = self.spark.createDataFrame(pdf).repartition(1)
+
+        @pandas_udf(returnType="int")
+        def udf(column):
+            return pd.Series(np.linspace(0, 1, 3))
+
+        udf_boolean = df.select(['A']).withColumn('udf', udf('A'))
+
+        # Since 0.11.0, PyArrow supports the feature to raise an error for unsafe cast.
+        if LooseVersion(pa.__version__) >= LooseVersion("0.11.0"):

Review comment:
   Yeah but in pyarrow 0.11.0+ you'd see an error:
   ```python
   >>> import pandas as pd
   >>> import pyarrow as pa
   >>> pa.__version__
   '0.11.1'
   >>> pa.Array.from_pandas(pd.Series([128.0]), type=pa.int8())
   Traceback (most recent call last):
     File "<stdin>", line 1, in <module>
     File "pyarrow/array.pxi", line 474, in pyarrow.lib.Array.from_pandas
     File "pyarrow/array.pxi", line 169, in pyarrow.lib.array
     File "pyarrow/array.pxi", line 69, in pyarrow.lib._ndarray_to_array
     File "pyarrow/error.pxi", line 81, in pyarrow.lib.check_status
   pyarrow.lib.ArrowInvalid: Floating point value truncated
   >>> pa.Array.from_pandas(pd.Series([128.0]), type=pa.int8(), safe=False)
   <pyarrow.lib.Int8Array object at 0x7f3ee1a4b868>
   [
     -128
   ]
   ```
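For context, the snippet below is a minimal standalone sketch (not the PR's actual implementation) of how the safe-cast behavior shown above can be wrapped to surface a clearer error. The helper name `series_to_arrow` and the `RuntimeError` message are illustrative assumptions; `pa.Array.from_pandas(..., safe=True)` and `pyarrow.lib.ArrowInvalid` are the PyArrow 0.11.0+ behavior demonstrated in the comment.

```python
# A minimal sketch, not the PR's implementation: wrap PyArrow's safe-cast check
# (pyarrow >= 0.11.0) to raise a more descriptive error on unsafe conversions.
import pandas as pd
import pyarrow as pa


def series_to_arrow(series, arrow_type):
    # Hypothetical helper: convert a pandas Series to a pyarrow Array and
    # re-raise unsafe casts with a clearer message.
    try:
        # safe=True (the default in pyarrow >= 0.11.0) raises ArrowInvalid
        # when the conversion would truncate or overflow values.
        return pa.Array.from_pandas(series, type=arrow_type, safe=True)
    except pa.lib.ArrowInvalid as e:
        raise RuntimeError(
            "Unsafe conversion from %s to %s detected: %s"
            % (series.dtype, arrow_type, e))


# 128.0 does not fit into int8, so the safe cast fails with a clear error.
try:
    series_to_arrow(pd.Series([128.0]), pa.int8())
except RuntimeError as e:
    print(e)
```

With `safe=False` the same conversion would silently wrap to -128, which is exactly the kind of silent corruption the version-gated test in the diff is meant to catch.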