Re: java.io.InvalidClassException: org.apache.spark.api.java.JavaUtils$SerializableMapWrapper; no valid constructor

2014-12-01 Thread lokeshkumar
The workaround was to wrap the map returned by the Spark libraries in a HashMap
and then broadcast that copy.
Could anyone please let me know if there is an open issue for this?
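The described workaround can be sketched as follows. This is a self-contained illustration, not the poster's actual code: the class name, the sample map, and the roundTrip helper are all invented here, and the real sc.broadcast(...) call is only referenced in a comment (broadcast values go through Java serialization, which roundTrip stands in for).

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.HashMap;
import java.util.Map;

public class HashMapCopyDemo {
    /** Round-trip an object through Java serialization, as broadcasting would. */
    @SuppressWarnings("unchecked")
    static <T> T roundTrip(T obj) throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        }
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            return (T) ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for a map handed back by a Spark Java API; in the failing
        // case it would be backed by JavaUtils$SerializableMapWrapper.
        Map<String, Integer> fromSpark = new HashMap<>();
        fromSpark.put("a", 1);
        fromSpark.put("b", 2);

        // The workaround: defensively copy into a plain HashMap before
        // calling sc.broadcast(...), so that only JDK classes (which
        // deserialize cleanly on the executors) ever hit the serializer.
        HashMap<String, Integer> safeCopy = new HashMap<>(fromSpark);

        Map<String, Integer> restored = roundTrip(safeCopy);
        System.out.println(restored.equals(safeCopy));
    }
}
```

The copy is cheap relative to broadcasting and decouples the broadcast value from any Spark-internal wrapper class.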



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/java-io-InvalidClassException-org-apache-spark-api-java-JavaUtils-SerializableMapWrapper-no-valid-cor-tp20034p20070.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: java.io.InvalidClassException: org.apache.spark.api.java.JavaUtils$SerializableMapWrapper; no valid constructor

2014-12-01 Thread Josh Rosen
SerializableMapWrapper was added in
https://issues.apache.org/jira/browse/SPARK-3926; do you mind opening a new
JIRA and linking it to that one?

On Mon, Dec 1, 2014 at 12:17 AM, lokeshkumar lok...@dataken.net wrote:

 The workaround was to wrap the map returned by spark libraries into HashMap
 and then broadcast them.
 Could anyone please let me know if there is any issue open?







java.io.InvalidClassException: org.apache.spark.api.java.JavaUtils$SerializableMapWrapper; no valid constructor

2014-11-29 Thread lokeshkumar
Hi forum,

We have been using Spark 1.1.0 and, due to some bugs in it, we upgraded to the
latest 1.3.0 from the master branch.
Now we are getting the error below while using a broadcast variable.

Could anyone please point out what's wrong here?

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 815081.0 failed 4 times, most recent failure: Lost task 0.3 in stage 815081.0 (TID 4751, ns2.x.net): java.io.InvalidClassException: org.apache.spark.api.java.JavaUtils$SerializableMapWrapper; no valid constructor
    at java.io.ObjectStreamClass$ExceptionInfo.newInvalidClassException(ObjectStreamClass.java:150)
    at java.io.ObjectStreamClass.checkDeserialize(ObjectStreamClass.java:768)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
    at org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:216)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1.apply(TorrentBroadcast.scala:177)
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1000)
    at org.apache.spark.broadcast.TorrentBroadcast.readBroadcastBlock(TorrentBroadcast.scala:164)
    at org.apache.spark.broadcast.TorrentBroadcast._value$lzycompute(TorrentBroadcast.scala:64)
    at org.apache.spark.broadcast.TorrentBroadcast._value(TorrentBroadcast.scala:64)
    at org.apache.spark.broadcast.TorrentBroadcast.getValue(TorrentBroadcast.scala:87)
    at com.xxx.common.chores.kafka.kafkaListenerChore$4$1.call(kafkaListenerChore.java:418)
    at com.xxx.common.chores.kafka.kafkaListenerChore$4$1.call(kafkaListenerChore.java:406)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$foreachPartition$1.apply(JavaRDDLike.scala:195)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$foreachPartition$1.apply(JavaRDDLike.scala:195)
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:775)
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:775)
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1314)
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1314)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
    at org.apache.spark.scheduler.Task.run(Task.scala:56)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
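For context, the "no valid constructor" failure in the trace above is a general rule of java.io.ObjectInputStream, not something Spark-specific: when deserializing, the first non-serializable ancestor of the class must have an accessible no-arg constructor, and the check happens at read time (ObjectStreamClass.checkDeserialize, exactly as in the trace). The sketch below reproduces it with invented stand-in classes (Base/Sub are hypothetical, standing in for SerializableMapWrapper and its superclass):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InvalidClassException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class NoValidConstructorDemo {
    // Non-serializable base class WITHOUT a no-arg constructor.
    static class Base {
        Base(int unused) { }
    }

    // Serializable subclass. Deserialization must run the no-arg constructor
    // of the first non-serializable ancestor (Base); none exists, so
    // readObject fails with InvalidClassException: "no valid constructor".
    static class Sub extends Base implements Serializable {
        Sub() { super(0); }
    }

    /** Returns true if deserializing Sub raises the "no valid constructor" error. */
    static boolean triggersNoValidConstructor() throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(new Sub());   // writing succeeds...
        }
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            ois.readObject();             // ...reading is what fails
            return false;
        } catch (InvalidClassException e) {
            return e.getMessage().endsWith("no valid constructor");
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(triggersNoValidConstructor());
    }
}
```

This also explains why the error surfaces only on the executors: the driver serializes the broadcast value without complaint, and the constructor check fires when a task deserializes it.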



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/java-io-InvalidClassException-org-apache-spark-api-java-JavaUtils-SerializableMapWrapper-no-valid-cor-tp20034.html


