[ https://issues.apache.org/jira/browse/SPARK-4377?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Patrick Wendell resolved SPARK-4377.
------------------------------------
    Resolution: Fixed

> ZooKeeperPersistenceEngine: java.lang.IllegalStateException: Trying to deserialize a serialized ActorRef without an ActorSystem in scope.
> -----------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-4377
>                 URL: https://issues.apache.org/jira/browse/SPARK-4377
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy, Spark Core
>    Affects Versions: 1.3.0
>            Reporter: Josh Rosen
>            Assignee: Prashant Sharma
>            Priority: Blocker
>             Fix For: 1.3.0
>
>
> It looks like ZooKeeperPersistenceEngine is broken in the current Spark master (23f5bdf06a388e08ea5a69e848f0ecd5165aa481). Here's a log excerpt from a secondary master when it takes over from a failed primary master:
> {code}
> 14/11/13 04:37:12 WARN ConnectionStateManager: There are no ConnectionStateListeners registered.
> 14/11/13 04:37:19 INFO Master: Registering worker 172.17.0.223:8888 with 8 cores, 984.0 MB RAM
> 14/11/13 04:37:20 INFO Master: Registering worker 172.17.0.224:8888 with 8 cores, 984.0 MB RAM
> 14/11/13 04:37:35 INFO Master: Registering worker 172.17.0.223:8888 with 8 cores, 984.0 MB RAM
> 14/11/13 04:37:35 INFO Master: Registering worker 172.17.0.224:8888 with 8 cores, 984.0 MB RAM
> 14/11/13 04:37:43 INFO Master: Registering worker 172.17.0.224:8888 with 8 cores, 984.0 MB RAM
> 14/11/13 04:37:47 INFO Master: Registering worker 172.17.0.223:8888 with 8 cores, 984.0 MB RAM
> 14/11/13 04:37:51 INFO Master: Registering worker 172.17.0.224:8888 with 8 cores, 984.0 MB RAM
> 14/11/13 04:37:59 INFO Master: Registering worker 172.17.0.223:8888 with 8 cores, 984.0 MB RAM
> 14/11/13 04:37:59 INFO Master: Registering worker 172.17.0.224:8888 with 8 cores, 984.0 MB RAM
> 14/11/13 04:38:06 INFO ZooKeeperLeaderElectionAgent: We have gained leadership
> 14/11/13 04:38:06 WARN ZooKeeperPersistenceEngine: Exception while reading persisted file, deleting
> java.io.IOException: java.lang.IllegalStateException: Trying to deserialize a serialized ActorRef without an ActorSystem in scope. Use 'akka.serialization.Serialization.currentSystem.withValue(system) { ... }'
>       at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:988)
>       at org.apache.spark.deploy.master.ApplicationInfo.readObject(ApplicationInfo.scala:51)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>       at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>       at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>       at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:81)
>       at org.apache.spark.deploy.master.ZooKeeperPersistenceEngine.deserializeFromFile(ZooKeeperPersistenceEngine.scala:69)
>       at org.apache.spark.deploy.master.ZooKeeperPersistenceEngine$$anonfun$read$1.apply(ZooKeeperPersistenceEngine.scala:54)
>       at org.apache.spark.deploy.master.ZooKeeperPersistenceEngine$$anonfun$read$1.apply(ZooKeeperPersistenceEngine.scala:54)
>       at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>       at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>       at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>       at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
>       at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
>       at scala.collection.AbstractTraversable.map(Traversable.scala:105)
>       at org.apache.spark.deploy.master.ZooKeeperPersistenceEngine.read(ZooKeeperPersistenceEngine.scala:54)
>       at org.apache.spark.deploy.master.ZooKeeperPersistenceEngine.read(ZooKeeperPersistenceEngine.scala:32)
>       at org.apache.spark.deploy.master.PersistenceEngine$class.readPersistedData(PersistenceEngine.scala:84)
>       at org.apache.spark.deploy.master.ZooKeeperPersistenceEngine.readPersistedData(ZooKeeperPersistenceEngine.scala:32)
>       at org.apache.spark.deploy.master.Master$$anonfun$receiveWithLogging$1.applyOrElse(Master.scala:181)
>       at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
>       at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
>       at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
>       at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:53)
>       at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42)
>       at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
>       at org.apache.spark.util.ActorLogReceive$$anon$1.applyOrElse(ActorLogReceive.scala:42)
>       at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
>       at org.apache.spark.deploy.master.Master.aroundReceive(Master.scala:48)
>       at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
>       at akka.actor.ActorCell.invoke(ActorCell.scala:487)
>       at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
>       at akka.dispatch.Mailbox.run(Mailbox.scala:220)
>       at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
>       at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>       at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>       at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>       at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> Caused by: java.lang.IllegalStateException: Trying to deserialize a serialized ActorRef without an ActorSystem in scope. Use 'akka.serialization.Serialization.currentSystem.withValue(system) { ... }'
>       at akka.actor.SerializedActorRef.readResolve(ActorRef.scala:407)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1104)
>       at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1807)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>       at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:500)
>       at org.apache.spark.deploy.master.ApplicationInfo$$anonfun$readObject$1.apply$mcV$sp(ApplicationInfo.scala:52)
>       at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:985)
>       ... 44 more
> {code}
> I found this by running my custom Spark integration test framework, which has a test that roughly corresponds to Apache Spark's {{FaultToleranceTest}}, specifically the "single-master-halt" test.
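>
> For context on the failure mode: the stack trace shows that {{ApplicationInfo}} holds an Akka {{ActorRef}} field, and {{ZooKeeperPersistenceEngine.deserializeFromFile}} reads persisted state back through Spark's plain {{JavaSerializer}}, so no {{ActorSystem}} is in scope when {{SerializedActorRef.readResolve}} runs. Below is a minimal sketch of the direction the exception message points at, assuming Akka 2.3's {{SerializationExtension}} API; {{deserializeFromBytes}} is a hypothetical helper for illustration, not the engine's actual code or necessarily the committed fix.
> {code}
> // Hypothetical helper (not ZooKeeperPersistenceEngine's actual method):
> // deserialize persisted master state with an ActorSystem in scope so that
> // Akka can resolve serialized ActorRef fields (e.g. inside ApplicationInfo).
> import akka.actor.ActorSystem
> import akka.serialization.SerializationExtension
>
> def deserializeFromBytes[T](bytes: Array[Byte], clazz: Class[T],
>                             system: ActorSystem): T = {
>   // Akka's Serialization routes java.io.Serializable classes through its own
>   // JavaSerializer, which deserializes with the ActorSystem in scope, so
>   // SerializedActorRef.readResolve can find it.
>   val serialization = SerializationExtension(system)
>   serialization.deserialize(bytes, clazz).get
> }
> {code}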



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
