I have no idea... We use Scala. You upgraded to 1.4 very quickly... are you
using Spark in production? Spark 1.3 is better than Spark 1.4.


------------------ Original Message ------------------
From: "Zhou Qianhao" <z.qian...@gmail.com>;
Sent: Friday, August 14, 2015, 11:14
To: "Sea" <261810...@qq.com>; "dev@spark.apache.org" <dev@spark.apache.org>;

Subject: Re: please help with ClassNotFoundException



Hi Sea,
    I have updated Spark to 1.4.1, but the problem still exists. Any idea?


On Fri, Aug 14, 2015 at 12:36, Sea <261810...@qq.com> wrote:

Yes, I guess so. I have seen this bug before.




------------------ Original Message ------------------
From: "Zhou Qianhao" <z.qian...@gmail.com>;
Sent: Thursday, August 13, 2015, 9:30
To: "Sea" <261810...@qq.com>; "dev@spark.apache.org" <dev@spark.apache.org>;

Subject: Re: please help with ClassNotFoundException




Hi Sea,
    Is it the same issue as https://issues.apache.org/jira/browse/SPARK-8368?


On Thu, Aug 13, 2015 at 6:52, Sea <261810...@qq.com> wrote:

Are you using 1.4.0? If so, upgrade to 1.4.1.




------------------ Original Message ------------------
From: "Zhou Qianhao" <qhz...@apache.org>;
Sent: Thursday, August 13, 2015, 6:04
To: "dev" <dev@spark.apache.org>;

Subject: please help with ClassNotFoundException




Hi,
    I am using Spark 1.4 and have run into an issue.
    I am trying to use the aggregate function:

    JavaRDD<String> rdd = ...; // some RDD
    HashMap<Long, TypeA> zeroValue = new HashMap<>();
    // add initial key-value pair to zeroValue
    rdd.aggregate(zeroValue,
                  new Function2<HashMap<Long, TypeA>, String,
                                HashMap<Long, TypeA>>() { /* seqOp implementation */ },
                  new Function2<HashMap<Long, TypeA>, HashMap<Long, TypeA>,
                                HashMap<Long, TypeA>>() { /* combOp implementation */ });

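    For reference, a minimal self-contained sketch of the same pattern. The
TypeA class below is a hypothetical stand-in (anything carried in the zero
value must be Serializable), and the class name and operator bodies are
illustrative only:

    import java.io.Serializable;
    import java.util.Arrays;
    import java.util.HashMap;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.api.java.function.Function2;

    public class AggregateSketch {
        // Hypothetical stand-in for TypeA; it must implement Serializable.
        public static class TypeA implements Serializable {
            public int count;
            public TypeA(int count) { this.count = count; }
        }

        public static void main(String[] args) {
            JavaSparkContext sc = new JavaSparkContext(
                new SparkConf().setAppName("aggregate-sketch"));
            JavaRDD<String> rdd = sc.parallelize(Arrays.asList("a", "bb", "ccc"));

            HashMap<Long, TypeA> zeroValue = new HashMap<>();
            zeroValue.put(0L, new TypeA(0)); // initial key-value pair

            HashMap<Long, TypeA> result = rdd.aggregate(
                zeroValue,
                // seqOp: fold one String element into the accumulator
                new Function2<HashMap<Long, TypeA>, String, HashMap<Long, TypeA>>() {
                    public HashMap<Long, TypeA> call(HashMap<Long, TypeA> acc, String s) {
                        acc.get(0L).count += s.length();
                        return acc;
                    }
                },
                // combOp: merge two partial accumulators
                new Function2<HashMap<Long, TypeA>, HashMap<Long, TypeA>, HashMap<Long, TypeA>>() {
                    public HashMap<Long, TypeA> call(HashMap<Long, TypeA> a, HashMap<Long, TypeA> b) {
                        a.get(0L).count += b.get(0L).count;
                        return a;
                    }
                });
            System.out.println(result.get(0L).count); // 6
            sc.stop();
        }
    }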

    Here is the stack trace when I run the application:


Caused by: java.lang.ClassNotFoundException: TypeA
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:274)
        at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:66)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at java.util.HashMap.readObject(HashMap.java:1180)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:69)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:89)
        at org.apache.spark.util.Utils$.clone(Utils.scala:1458)
        at org.apache.spark.rdd.RDD$$anonfun$aggregate$1.apply(RDD.scala:1049)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:109)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:286)
        at org.apache.spark.rdd.RDD.aggregate(RDD.scala:1047)
        at org.apache.spark.api.java.JavaRDDLike$class.aggregate(JavaRDDLike.scala:413)
        at org.apache.spark.api.java.AbstractJavaRDDLike.aggregate(JavaRDDLike.scala:47)

    However, I have checked that TypeA is in the jar file, which is on the
classpath.
    When I use an empty HashMap as the zeroValue, the exception goes away.
    Has anyone else run into this problem, or can anyone help me with it?
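    For concreteness, here is a sketch of the empty-zeroValue variant that
avoids the exception (using the same hypothetical TypeA stand-in as above,
with illustrative operator bodies). The seqOp seeds the initial pair lazily,
so no TypeA instance has to travel inside the zero value:

    HashMap<Long, TypeA> result = rdd.aggregate(
        new HashMap<Long, TypeA>(),  // empty zero value clones without touching TypeA
        new Function2<HashMap<Long, TypeA>, String, HashMap<Long, TypeA>>() {
            public HashMap<Long, TypeA> call(HashMap<Long, TypeA> acc, String s) {
                TypeA v = acc.get(0L);
                if (v == null) {           // seed the initial pair lazily
                    v = new TypeA(0);
                    acc.put(0L, v);
                }
                v.count += s.length();
                return acc;
            }
        },
        new Function2<HashMap<Long, TypeA>, HashMap<Long, TypeA>, HashMap<Long, TypeA>>() {
            public HashMap<Long, TypeA> call(HashMap<Long, TypeA> a, HashMap<Long, TypeA> b) {
                // either side may still be empty if a partition had no elements
                for (java.util.Map.Entry<Long, TypeA> e : b.entrySet()) {
                    TypeA cur = a.get(e.getKey());
                    if (cur == null) a.put(e.getKey(), e.getValue());
                    else cur.count += e.getValue().count;
                }
                return a;
            }
        });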



-- 

Best Regards,
Zhou Qianhao



-- 

Best Regards,
Zhou Qianhao
