Hi,
I am trying to pass a numpy array as an argument to predict() in a class in spark/python/pyspark/mllib/clustering.py, which in turn passes it to callMLlibFunc(name, *args) in spark/python/pyspark/mllib/common.py.

The value is then passed to _py2java(sc, obj), and at that point I get the following exception:

Py4JJavaError: An error occurred while calling z:org.apache.spark.mllib.api.python.SerDe.loads.
: net.razorvine.pickle.PickleException: expected zero arguments for construction of ClassDict (for numpy.core.multiarray._reconstruct)
        at net.razorvine.pickle.objects.ClassDictConstructor.construct(ClassDictConstructor.java:23)
        at net.razorvine.pickle.Unpickler.load_reduce(Unpickler.java:617)
        at net.razorvine.pickle.Unpickler.dispatch(Unpickler.java:170)
        at net.razorvine.pickle.Unpickler.load(Unpickler.java:84)
        at net.razorvine.pickle.Unpickler.loads(Unpickler.java:97)


Why is common._py2java(sc, obj) not handling the numpy array type?
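As a possible workaround (untested, and the names below are only illustrative), would converting the array before it reaches _py2java be the right approach, e.g. to a plain Python list or to an MLlib DenseVector? A minimal sketch of what I mean:

    import numpy as np
    from pyspark.mllib.linalg import Vectors

    arr = np.array([1.0, 2.0, 3.0])

    # Plain Python list: only built-in types are pickled, so the JVM-side
    # unpickler never sees numpy.core.multiarray._reconstruct.
    as_list = arr.tolist()

    # MLlib DenseVector: pyspark registers its own pickler for this type,
    # so it should cross the Py4J/pickle boundary cleanly.
    as_vector = Vectors.dense(arr)

    # model.predict(as_vector)  # 'model' stands for the fitted clustering model

Or is the conversion supposed to happen inside _py2java itself?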

Please help.


--

Regards,
Meethu Mathew
Engineer
Flytxt
www.flytxt.com | Visit our blog <http://blog.flytxt.com/> | Follow us <http://www.twitter.com/flytxt> | Connect on Linkedin <http://www.linkedin.com/home?trk=hb_tab_home_top>
