Re: 1.1.0-SNAPSHOT possible regression
Actually, apparently there is a pull request for it. Thanks for reporting! https://github.com/apache/spark/pull/1836

On Fri, Aug 8, 2014 at 10:50 AM, Ron Gonzalez wrote:
> [...]
Re: 1.1.0-SNAPSHOT possible regression
Sure, let me give it a try. Any tips? I've only started looking at the Spark code more closely recently. I can compare the Spark 1.0.1 code and see what's going on...

Thanks,
Ron

On Friday, August 8, 2014 10:43 AM, Reynold Xin wrote:
> [...]
Re: 1.1.0-SNAPSHOT possible regression
I created a JIRA ticket to track this: https://issues.apache.org/jira/browse/SPARK-2928

Let me know if you need help with it.

On Fri, Aug 8, 2014 at 10:40 AM, Reynold Xin wrote:
> [...]
Re: 1.1.0-SNAPSHOT possible regression
Yes, I'm pretty sure it doesn't actually use the right serializer in TorrentBroadcast:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/broadcast/TorrentBroadcast.scala#L232

And TorrentBroadcast is turned on by default for 1.1 right now. Do you want to submit a pull request to fix that? This would be a critical fix for 1.1 that's worth doing.

On Fri, Aug 8, 2014 at 10:37 AM, Ron Gonzalez wrote:
> [...]
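For context, the line linked above sits in TorrentBroadcast's blockifyObject, which serializes the broadcast value through the Java-serialization helper in Utils, so a configured spark.serializer (e.g. Kryo) never sees the value on this path. A minimal sketch of the kind of change being discussed (this is a sketch only, not the contents of the actual pull request):

```scala
// Sketch only -- not the actual patch.
// Before: the broadcast value was always blockified with Java
// serialization, ignoring the serializer set via spark.serializer:
//   val byteArray = Utils.serialize(obj)
// A fix would route the value through the serializer that SparkEnv
// was configured with (e.g. KryoSerializer):
val ser = SparkEnv.get.serializer.newInstance()
val byteArray = ser.serialize(obj).array()
```

With the second form, broadcasting a class that is Kryo-serializable but not java.io.Serializable (like the Avro GenericData.Record in the trace below) would no longer throw NotSerializableException.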
Re: 1.1.0-SNAPSHOT possible regression
Oops, the exception is below.

In local mode it works, but that's expected: TorrentBroadcast is guarded by if (!isLocal), so that's the only time the broadcast actually happens. It really seems as if the Kryo wrapper didn't kick in for some reason. Do we have a unit test that exercises the Kryo serialization that I can give a try?

Thanks,
Ron

Exception in thread "Driver" java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:180)
Caused by: java.io.NotSerializableException: org.apache.avro.generic.GenericData$Record
    - custom writeObject data (class "scala.collection.mutable.HashMap")

On Friday, August 8, 2014 10:16 AM, Reynold Xin wrote:
> [...]
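On the unit-test question: a regression test along these lines could catch this, though, as noted above, this snapshot's TorrentBroadcast skips the actual broadcast when isLocal, so the test would need to run against a non-local master (or drive blockifyObject directly) to exercise the serialization path. The sketch below is hypothetical; BroadcastKryoCheck and NotJavaSerializable are made-up names, not existing Spark tests:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Kryo can handle this class, but java.io serialization cannot,
// because it does not implement java.io.Serializable.
class NotJavaSerializable(val x: Int)

object BroadcastKryoCheck {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("broadcast-kryo-check")
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .set("spark.broadcast.factory",
           "org.apache.spark.broadcast.TorrentBroadcastFactory")
    val sc = new SparkContext(conf)
    try {
      // With the bug, this throws NotSerializableException from
      // TorrentBroadcast's blockifyObject; with Kryo honored, it succeeds.
      val b = sc.broadcast(new NotJavaSerializable(42))
      val values = sc.parallelize(1 to 4).map(_ => b.value.x).collect()
      assert(values.forall(_ == 42))
    } finally {
      sc.stop()
    }
  }
}
```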
Re: 1.1.0-SNAPSHOT possible regression
Pasting a better formatted trace:

    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1180)
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
    at scala.collection.mutable.HashMap$$anonfun$writeObject$1.apply(HashMap.scala:137)
    at scala.collection.mutable.HashMap$$anonfun$writeObject$1.apply(HashMap.scala:135)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
    at scala.collection.mutable.HashTable$class.serializeTo(HashTable.scala:124)
    at scala.collection.mutable.HashMap.serializeTo(HashMap.scala:39)
    at scala.collection.mutable.HashMap.writeObject(HashMap.scala:135)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:988)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1495)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
    at org.apache.spark.util.Utils$.serialize(Utils.scala:64)
    at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:232)
    at org.apache.spark.broadcast.TorrentBroadcast.sendBroadcast(TorrentBroadcast.scala:85)
    at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:66)
    at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:36)
    at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:29)
    at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
    at org.apache.spark.SparkContext.broadcast(SparkContext.scala:809)

On Fri, Aug 8, 2014 at 10:12 AM, Ron Gonzalez wrote:
> [...]
Re: 1.1.0-SNAPSHOT possible regression
Looks like you didn't actually paste the exception message. Do you mind doing that?

On Fri, Aug 8, 2014 at 10:14 AM, Reynold Xin wrote:
> Pasting a better formatted trace:
>
> at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1180)
> at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
> at scala.collection.mutable.HashMap$$anonfun$writeObject$1.apply(HashMap.scala:137)
> at scala.collection.mutable.HashMap$$anonfun$writeObject$1.apply(HashMap.scala:135)
> at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
> at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
> at scala.collection.mutable.HashTable$class.serializeTo(HashTable.scala:124)
> at scala.collection.mutable.HashMap.serializeTo(HashMap.scala:39)
> at scala.collection.mutable.HashMap.writeObject(HashMap.scala:135)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:988)
> at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1495)
> at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
> at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
> at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
> at org.apache.spark.util.Utils$.serialize(Utils.scala:64)
> at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:232)
> at org.apache.spark.broadcast.TorrentBroadcast.sendBroadcast(TorrentBroadcast.scala:85)
> at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:66)
> at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:36)
> at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:29)
> at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
> at org.apache.spark.SparkContext.broadcast(SparkContext.scala:809)
>
> On Fri, Aug 8, 2014 at 10:12 AM, Ron Gonzalez <zlgonza...@yahoo.com.invalid> wrote:
>> Hi,
>> I have a running Spark app against the released version of 1.0.1. I
>> recently decided to try and upgrade to the trunk version. Interestingly
>> enough, after building the 1.1.0-SNAPSHOT assembly, replacing it as my
>> assembly in my app caused errors. In particular, it seems Kryo
>> serialization isn't taking effect. Replacing it with 1.0.1 gets it
>> working again.
>>
>> Any thoughts? Is this a known issue?
>>
>> Thanks,
>> Ron
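The trace above bottoms out in plain Java serialization (Utils$.serialize wrapping an ObjectOutputStream), which rejects any object that does not implement java.io.Serializable; Avro's GenericData$Record does not, hence the NotSerializableException. A minimal JDK-only sketch of that failure mode, using a hypothetical AvroLikeRecord as a stand-in for the Avro record class:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;

public class SerializationDemo {
    // Stand-in for org.apache.avro.generic.GenericData$Record,
    // which does not implement java.io.Serializable.
    static class AvroLikeRecord {
        int field = 42;
    }

    // Tries plain Java serialization, the code path TorrentBroadcast
    // takes here regardless of the spark.serializer setting.
    static boolean javaSerializable(Object o) {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (IOException e) {
            // NotSerializableException is a subclass of IOException.
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(javaSerializable("a string"));           // true
        System.out.println(javaSerializable(new AvroLikeRecord())); // false
    }
}
```

With a Kryo-backed serializer the record would round-trip fine, which is why the app worked on 1.0.1, where the configured serializer was actually used for broadcast values.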
1.1.0-SNAPSHOT possible regression
Hi,
I have a running Spark app against the released version of 1.0.1. I recently decided to try and upgrade to the trunk version. Interestingly enough, after building the 1.1.0-SNAPSHOT assembly, replacing it as my assembly in my app caused errors. In particular, it seems Kryo serialization isn't taking effect. Replacing it with 1.0.1 gets it working again.

Any thoughts? Is this a known issue?

Thanks,
Ron

at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1180)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
at scala.collection.mutable.HashMap$$anonfun$writeObject$1.apply(HashMap.scala:137)
at scala.collection.mutable.HashMap$$anonfun$writeObject$1.apply(HashMap.scala:135)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
at scala.collection.mutable.HashTable$class.serializeTo(HashTable.scala:124)
at scala.collection.mutable.HashMap.serializeTo(HashMap.scala:39)
at scala.collection.mutable.HashMap.writeObject(HashMap.scala:135)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:988)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1495)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
at org.apache.spark.util.Utils$.serialize(Utils.scala:64)
at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:232)
at org.apache.spark.broadcast.TorrentBroadcast.sendBroadcast(TorrentBroadcast.scala:85)
at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:66)
at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:36)
at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:29)
at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
at org.apache.spark.SparkContext.broadcast(SparkContext.scala:809)
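For context, this is roughly how an application like the one described opts into Kryo serialization; the regression discussed in this thread is that TorrentBroadcast in 1.1.0-SNAPSHOT serialized the broadcast value with plain Java serialization regardless of this setting. A sketch against Spark's 1.x Java API; the registrator class name is hypothetical and a custom registrator is optional:

```java
import org.apache.spark.SparkConf;

public class KryoConfSketch {
    public static SparkConf kryoConf() {
        return new SparkConf()
            .setAppName("kryo-demo")
            // Ask Spark to use Kryo instead of Java serialization.
            .set("spark.serializer",
                 "org.apache.spark.serializer.KryoSerializer")
            // Hypothetical registrator for classes such as Avro records.
            .set("spark.kryo.registrator", "com.example.MyKryoRegistrator");
    }
}
```

Because the failing code path calls Utils$.serialize directly instead of going through SparkEnv's configured serializer, this configuration has no effect on broadcast values in the snapshot build, which matches the symptom of Kryo "not taking."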