Re: please help with ClassNotFoundException

2015-08-14 Thread 周千昊
Hi, Sea
 Problem solved. It turned out that I had updated the Spark cluster to
1.4.1, but the client had not been updated.
 Thank you so much.

Sea <261810...@qq.com> wrote on Friday, Aug 14, 2015 at 1:01 PM:

 I have no idea... We use Scala. You upgraded to 1.4 so quickly... Are you
 using Spark in production? Spark 1.3 is better than Spark 1.4.

 ------ Original Message ------
 *From:* 周千昊 <z.qian...@gmail.com>
 *Date:* Friday, Aug 14, 2015, 11:14 AM
 *To:* Sea <261810...@qq.com>; dev@spark.apache.org
 *Subject:* Re: please help with ClassNotFoundException

 Hi Sea
  I have updated Spark to 1.4.1; however, the problem still exists. Any
 idea?

Sea <261810...@qq.com> wrote on Friday, Aug 14, 2015 at 12:36 AM:

 Yes, I guess so. I have seen this bug before.


 ------ Original Message ------
 *From:* 周千昊 <z.qian...@gmail.com>
 *Date:* Thursday, Aug 13, 2015, 9:30 PM
 *To:* Sea <261810...@qq.com>; dev@spark.apache.org
 *Subject:* Re: please help with ClassNotFoundException

 Hi Sea
 Is it the same issue as
 https://issues.apache.org/jira/browse/SPARK-8368

Sea <261810...@qq.com> wrote on Thursday, Aug 13, 2015 at 6:52 PM:

 Are you using 1.4.0?  If yes, use 1.4.1


 ------ Original Message ------
 *From:* 周千昊 <qhz...@apache.org>
 *Date:* Thursday, Aug 13, 2015, 6:04 PM
 *To:* dev <dev@spark.apache.org>
 *Subject:* please help with ClassNotFoundException

 Hi,
 I am using Spark 1.4 and have run into an issue.
 I am trying to use the aggregate function:

 JavaRDD<String> rdd = ...; // some RDD
 HashMap<Long, TypeA> zeroValue = new HashMap<Long, TypeA>();
 // add initial key-value pair for zeroValue
 rdd.aggregate(zeroValue,
     new Function2<HashMap<Long, TypeA>, String,
         HashMap<Long, TypeA>>() { /* seqOp implementation */ },
     new Function2<HashMap<Long, TypeA>, HashMap<Long, TypeA>,
         HashMap<Long, TypeA>>() { /* combOp implementation */ });

 Here is the stack trace when I run the application:

 Caused by: java.lang.ClassNotFoundException: TypeA
 at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
 at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
 at java.security.AccessController.doPrivileged(Native Method)
 at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
 at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
 at java.lang.Class.forName0(Native Method)
 at java.lang.Class.forName(Class.java:274)
 at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:66)
 at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
 at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
 at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
 at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
 at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
 at java.util.HashMap.readObject(HashMap.java:1180)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:606)
 at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
 at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
 at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
 at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
 at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
 at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:69)
 at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:89)
 at org.apache.spark.util.Utils$.clone(Utils.scala:1458)
 at org.apache.spark.rdd.RDD$$anonfun$aggregate$1.apply(RDD.scala:1049)
 at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
 at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:109)
 at org.apache.spark.rdd.RDD.withScope(RDD.scala:286)
 at org.apache.spark.rdd.RDD.aggregate(RDD.scala:1047)
 at org.apache.spark.api.java.JavaRDDLike$class.aggregate(JavaRDDLike.scala:413)
 at org.apache.spark.api.java.AbstractJavaRDDLike.aggregate(JavaRDDLike.scala:47)
  However, I have checked that TypeA is in the jar file, which is on the
 classpath.
 When I use an empty HashMap as the zeroValue, the exception is gone.
 Has anyone met the same problem, or can anyone help me with it?
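
 [Editorial note: the stack trace above fails inside org.apache.spark.util.Utils$.clone, which deep-copies the zeroValue through Java serialization before aggregation. The sketch below only mimics that round-trip; TypeA and the clone helper are illustrative stand-ins, not Spark's actual code. It shows why every class stored in the zeroValue must be resolvable by the deserializing class loader, and why an empty HashMap avoids the exception: it never serializes a TypeA instance at all.]

 ```java
 import java.io.ByteArrayInputStream;
 import java.io.ByteArrayOutputStream;
 import java.io.IOException;
 import java.io.ObjectInputStream;
 import java.io.ObjectOutputStream;
 import java.io.Serializable;
 import java.util.HashMap;

 public class ZeroValueClone {
     // Hypothetical stand-in for the user's TypeA.
     static class TypeA implements Serializable {
         final String payload;
         TypeA(String payload) { this.payload = payload; }
     }

     // Mimics what Utils.clone does: serialize, then deserialize.
     // Deserialization is where resolveClass runs, so every class stored
     // in the value must be visible to the class loader on this side.
     @SuppressWarnings("unchecked")
     static <T extends Serializable> T clone(T value) throws IOException, ClassNotFoundException {
         ByteArrayOutputStream buffer = new ByteArrayOutputStream();
         try (ObjectOutputStream out = new ObjectOutputStream(buffer)) {
             out.writeObject(value);
         }
         try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(buffer.toByteArray()))) {
             return (T) in.readObject();
         }
     }

     public static void main(String[] args) throws Exception {
         HashMap<Long, TypeA> zeroValue = new HashMap<Long, TypeA>();
         zeroValue.put(1L, new TypeA("seed"));
         // With an entry present, a TypeA object travels through the stream,
         // so its class must resolve; a jar/version mismatch would fail here.
         HashMap<Long, TypeA> copy = clone(zeroValue);
         System.out.println(copy.get(1L).payload); // prints "seed"

         // An empty zeroValue never serializes a TypeA, which is why the
         // exception disappears in that case.
         System.out.println(clone(new HashMap<Long, TypeA>()).isEmpty()); // prints "true"
     }
 }
 ```

 Running this with the jar containing TypeA absent from the deserializing side would reproduce a ClassNotFoundException out of HashMap.readObject, matching the trace above.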

 --
 Best Regard
 ZhouQianhao




Re: please help with ClassNotFoundException

2015-08-13 Thread Sea
Yes, I guess so. I have seen this bug before.





Re: please help with ClassNotFoundException

2015-08-13 Thread 周千昊
Hi Sea
Is it the same issue as https://issues.apache.org/jira/browse/SPARK-8368

-- 
Best Regard
ZhouQianhao


Re: please help with ClassNotFoundException

2015-08-13 Thread Sea
I have no idea... We use Scala. You upgraded to 1.4 so quickly... Are you
using Spark in production? Spark 1.3 is better than Spark 1.4.



Re: please help with ClassNotFoundException

2015-08-13 Thread 周千昊
Hi Sea
 I have updated Spark to 1.4.1; however, the problem still exists. Any
idea?


-- 
Best Regard
ZhouQianhao