[ https://issues.apache.org/jira/browse/SPARK-36232?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-36232.
----------------------------------
    Fix Version/s: 3.3.0
       Resolution: Fixed

Issue resolved by pull request 34299
[https://github.com/apache/spark/pull/34299]

> Support creating a ps.Series/Index with `Decimal('NaN')` with Arrow disabled
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-36232
>                 URL: https://issues.apache.org/jira/browse/SPARK-36232
>             Project: Spark
>          Issue Type: Sub-task
>          Components: PySpark
>    Affects Versions: 3.2.0
>            Reporter: Xinrong Meng
>            Priority: Major
>             Fix For: 3.3.0
>
> {code:python}
> >>> import decimal as d
> >>> import pyspark.pandas as ps
> >>> import numpy as np
> >>> ps.utils.default_session().conf.set('spark.sql.execution.arrow.pyspark.enabled', True)
> >>> ps.Series([d.Decimal(1.0), d.Decimal(2.0), d.Decimal(np.nan)])
> 0       1
> 1       2
> 2    None
> dtype: object
> >>> ps.utils.default_session().conf.set('spark.sql.execution.arrow.pyspark.enabled', False)
> >>> ps.Series([d.Decimal(1.0), d.Decimal(2.0), d.Decimal(np.nan)])
> 21/07/02 15:01:07 ERROR Executor: Exception in task 6.0 in stage 13.0 (TID 51)
> net.razorvine.pickle.PickleException: problem construction object: java.lang.reflect.InvocationTargetException
> ...
> {code}
>
> As the code above shows, we cannot create a Series containing `Decimal('NaN')` when Arrow is disabled. We ought to fix that.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
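For context, a minimal stdlib-only sketch (no Spark required) of the special value involved: `decimal.Decimal` represents NaN as a first-class value, which the stack trace above suggests the non-Arrow serialization path did not handle. This is only an illustration of the input value, not of the Spark-side fix.

```python
import decimal
import math

# decimal.Decimal(np.nan) in the report is equivalent to Decimal('NaN'):
# converting a float NaN yields a Decimal NaN, not an exception.
nan_dec = decimal.Decimal(float("nan"))
print(nan_dec)            # NaN
print(nan_dec.is_nan())   # True

# A Decimal NaN is not a finite number; code that assumes every Decimal
# converts to an exact numeric value must special-case it.
print(math.isnan(float(nan_dec)))  # True
```

Series with Arrow enabled sidestep this because the None-mapping happens during Arrow conversion, as shown in the first half of the reproduction above.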