[ https://issues.apache.org/jira/browse/SPARK-36337?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-36337.
----------------------------------
    Fix Version/s: 3.3.0
       Resolution: Fixed

Issue resolved by pull request 34285
[https://github.com/apache/spark/pull/34285]

> decimal('Nan') is unsupported in net.razorvine.pickle
> ------------------------------------------------------
>
>                 Key: SPARK-36337
>                 URL: https://issues.apache.org/jira/browse/SPARK-36337
>             Project: Spark
>          Issue Type: Sub-task
>          Components: PySpark
>    Affects Versions: 3.2.0
>            Reporter: Yikun Jiang
>            Assignee: Yikun Jiang
>            Priority: Major
>              Fix For: 3.3.0
>
>
> Decimal('NaN') is currently not supported by net.razorvine.pickle.
>
> In Python:
> {code:java}
> >>> pickled = cloudpickle.dumps(decimal.Decimal('NaN'))
> >>> pickled
> b'\x80\x05\x95!\x00\x00\x00\x00\x00\x00\x00\x8c\x07decimal\x94\x8c\x07Decimal\x94\x93\x94\x8c\x03NaN\x94\x85\x94R\x94.'
> >>> pickle.loads(pickled)
> Decimal('NaN')
> {code}
>
> In Scala:
> {code:java}
> scala> import net.razorvine.pickle.{Pickler, Unpickler, PickleUtils}
> scala> val unpickle = new Unpickler
> scala> unpickle.loads(PickleUtils.str2bytes("\u0080\u0005\u0095!\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u008c\u0007decimal\u0094\u008c\u0007Decimal\u0094\u0093\u0094\u008c\u0003NaN\u0094\u0085\u0094R\u0094."))
> net.razorvine.pickle.PickleException: problem construction object: java.lang.reflect.InvocationTargetException
>   at net.razorvine.pickle.objects.AnyClassConstructor.construct(AnyClassConstructor.java:29)
>   at net.razorvine.pickle.Unpickler.load_reduce(Unpickler.java:773)
>   at net.razorvine.pickle.Unpickler.dispatch(Unpickler.java:213)
>   at net.razorvine.pickle.Unpickler.load(Unpickler.java:123)
>   at net.razorvine.pickle.Unpickler.loads(Unpickler.java:136)
>   ... 48 elided
> {code}
>
> I filed an issue against the pickle upstream:
> [https://github.com/irmen/pickle/issues/7]
>
> We should bump pickle to the latest version after it is fixed.
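As a side note, the Python half of the round trip can be reproduced with the standard library alone (the report above uses cloudpickle, but plain pickle produces the same protocol-5 payload shown there). This is a minimal sketch to confirm that CPython handles `Decimal('NaN')` fine; the failure reported in this issue occurs only on the JVM side, in net.razorvine.pickle's unpickler:

```python
# Minimal reproduction of the Python side: Decimal('NaN') pickles and
# unpickles cleanly in CPython. The JVM-side Unpickler is what fails.
import decimal
import pickle

# Protocol 5 matches the b'\x80\x05...' header quoted in the report.
payload = pickle.dumps(decimal.Decimal("NaN"), protocol=5)
restored = pickle.loads(payload)

# NaN never compares equal to itself, so check with is_nan() instead of ==.
print(type(restored).__name__)  # Decimal
print(restored.is_nan())        # True
```

Feeding the same `payload` bytes to `net.razorvine.pickle.Unpickler.loads` is what raises the `PickleException` shown in the Scala transcript.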
-- This message was sent by Atlassian Jira (v8.3.4#803005)