[ https://issues.apache.org/jira/browse/SPARK-4113?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14188007#comment-14188007 ]
Apache Spark commented on SPARK-4113:
-------------------------------------

User 'davies' has created a pull request for this issue:
https://github.com/apache/spark/pull/2973

> Python UDF on ArrayType
> -----------------------
>
>                 Key: SPARK-4113
>                 URL: https://issues.apache.org/jira/browse/SPARK-4113
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, SQL
>    Affects Versions: 1.2.0
>            Reporter: Davies Liu
>            Assignee: Davies Liu
>            Priority: Blocker
>             Fix For: 1.2.0
>
> From Matei:
>
> I have a table where column c is of type array<int>. However, the following
> set of commands fails:
>
>     sqlContext.registerFunction("py_func", lambda a: len(a))
>     %sql select py_func(c) from some_temp
>
> Error in SQL statement: java.lang.RuntimeException:
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in
> stage 252.0 failed 4 times, most recent failure: Lost task 2.3 in stage 252.0
> (TID 8454, ip-10-0-157-104.us-west-2.compute.internal):
> net.razorvine.pickle.PickleException: couldn't introspect javabean:
> java.lang.IllegalArgumentException: wrong number of arguments
>
>     net.razorvine.pickle.Pickler.put_javabean(Pickler.java:603)
>     net.razorvine.pickle.Pickler.dispatch(Pickler.java:299)
>     net.razorvine.pickle.Pickler.save(Pickler.java:125)
>     net.razorvine.pickle.Pickler.put_arrayOfObjects(Pickler.java:392)
>     net.razorvine.pickle.Pickler.dispatch(Pickler.java:195)
>     net.razorvine.pickle.Pickler.save(Pickler.java:125)
>     net.razorvine.pickle.Pickler.put_arrayOfObjects(Pickler.java:392)
>     net.razorvine.pickle.Pickler.dispatch(Pickler.java:195)
>     net.razorvine.pickle.Pickler.save(Pickler.java:125)
>     net.razorvine.pickle.Pickler.dump(Pickler.java:95)
>
> The same function works if I select a Row from my table into Python and call
> it on its third column.
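A note on the failure mode: the UDF itself is trivial; the stack trace shows the error occurring on the JVM side, in the Pyrolite pickler, before the Python function ever runs. PySpark ships row data to Python workers in pickle format, and an ArrayType cell must arrive in Python as a plain list. The sketch below (plain Python, no Spark required; `py_func` is the lambda from the report, the pickle round trip is a stand-in for the JVM-to-worker transfer) shows what should happen once serialization works:

```python
import pickle

# The UDF registered in the report: length of an array column value.
py_func = lambda a: len(a)

# PySpark serializes row data for Python workers in pickle format
# (via Pyrolite on the JVM side). An array<int> cell should pickle
# as a plain list; the reported bug is that Pyrolite instead tried
# to introspect the JVM array value as a javabean and threw.
row_cell = [1, 2, 3]                    # an array<int> cell
payload = pickle.dumps(row_cell)        # what the JVM side should produce
restored = pickle.loads(payload)        # what the Python worker sees

print(py_func(restored))                # 3
```

Consistent with this, the last line of the report makes sense: selecting a Row into Python goes through a code path that already converts the array correctly, so calling the same function there works.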
--
This message was sent by Atlassian JIRA (v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org