[ https://issues.apache.org/jira/browse/SPARK-41774?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-41774:
----------------------------------
    Issue Type: Test  (was: Improvement)

> Remove def test_vectorized_udf_unsupported_types
> ------------------------------------------------
>
>                 Key: SPARK-41774
>                 URL: https://issues.apache.org/jira/browse/SPARK-41774
>             Project: Spark
>          Issue Type: Test
>          Components: Pandas API on Spark
>    Affects Versions: 3.4.0
>            Reporter: Bjørn Jørgensen
>            Assignee: Bjørn Jørgensen
>            Priority: Trivial
>             Fix For: 3.4.0
>
>
> https://github.com/apache/spark/blob/18488158beee5435f99899f99b2e90fb6e37f3d5/python/pyspark/sql/tests/pandas/test_pandas_udf_scalar.py#L603
> {code:python}
> def test_vectorized_udf_wrong_return_type(self):
>     with QuietTest(self.sc):
>         for udf_type in [PandasUDFType.SCALAR, PandasUDFType.SCALAR_ITER]:
>             with self.assertRaisesRegex(
>                 NotImplementedError,
>                 "Invalid return type.*scalar Pandas UDF.*ArrayType.*TimestampType",
>             ):
>                 pandas_udf(lambda x: x, ArrayType(TimestampType()), udf_type)
> {code}
> is the same code as
> https://github.com/apache/spark/blob/18488158beee5435f99899f99b2e90fb6e37f3d5/python/pyspark/sql/tests/pandas/test_pandas_udf_scalar.py#L679
> {code:python}
> def test_vectorized_udf_unsupported_types(self):
>     with QuietTest(self.sc):
>         for udf_type in [PandasUDFType.SCALAR, PandasUDFType.SCALAR_ITER]:
>             with self.assertRaisesRegex(
>                 NotImplementedError,
>                 "Invalid return type.*scalar Pandas UDF.*ArrayType.*TimestampType",
>             ):
>                 pandas_udf(lambda x: x, ArrayType(TimestampType()), udf_type)
> {code}
> The two test bodies are identical, so we can remove one of them, or fix the copy-paste mistake so the second test covers what its name says.
> I found this using [Sonar|https://sonarcloud.io/project/issues?languages=py&resolved=false&rules=python%3AS4144&id=spark-python&open=AYQdnW-FRrJbVxW9ZDO0]

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
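[Editorial note: the duplication described in the issue above was found by Sonar rule S4144, which flags methods with identical implementations. The same kind of check can be sketched locally with Python's standard `ast` module. This is a minimal illustration, not how SonarCloud is implemented; `find_duplicate_functions` and the `SOURCE` snippet are hypothetical names made up for this example.]

```python
import ast
from collections import defaultdict

# Toy source mimicking the issue: two test methods with identical bodies.
SOURCE = '''
def test_vectorized_udf_wrong_return_type(self):
    return x * 2

def test_vectorized_udf_unsupported_types(self):
    return x * 2

def some_other_test(self):
    return x + 1
'''


def find_duplicate_functions(source: str) -> list[list[str]]:
    """Group function names by the AST dump of their bodies.

    Two functions whose bodies produce the same ast.dump() string are
    structural duplicates (same statements, same identifiers), which is
    roughly what Sonar rule S4144 reports. ast.dump() omits line/column
    attributes by default, so position in the file does not matter.
    """
    tree = ast.parse(source)
    groups: dict[str, list[str]] = defaultdict(list)
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            body_key = "\n".join(ast.dump(stmt) for stmt in node.body)
            groups[body_key].append(node.name)
    # Keep only groups with more than one function: those are duplicates.
    return [names for names in groups.values() if len(names) > 1]


print(find_duplicate_functions(SOURCE))
# → [['test_vectorized_udf_wrong_return_type', 'test_vectorized_udf_unsupported_types']]
```

Comparing AST dumps instead of raw text means the check ignores comments and whitespace but still distinguishes functions that differ in any identifier or literal.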