Hi all,

When I run my application, it runs for a while and gives me part of the output correctly. I then get the following error, after which the Spark shell exits.
14/07/07 13:54:53 INFO SendingConnection: Initiating connection to [localhost.localdomain/127.0.0.1:57423]
14/07/07 13:54:53 INFO ConnectionManager: Accepted connection from [localhost.localdomain/127.0.0.1]
14/07/07 13:54:53 INFO SendingConnection: Connected to [localhost.localdomain/127.0.0.1:57423], 2 messages pending
14/07/07 13:54:53 INFO BlockManager: Removing block taskresult_14
14/07/07 13:54:53 INFO BlockManager: Removing block taskresult_13
14/07/07 13:54:53 INFO MemoryStore: Block taskresult_14 of size 12174859 dropped from memory (free 296532358)
14/07/07 13:54:53 INFO MemoryStore: Block taskresult_13 of size 12115603 dropped from memory (free 308647961)
14/07/07 13:54:53 INFO BlockManagerInfo: Removed taskresult_14 on pzxnvm2018.dcld.pldc.kp.org:50924 in memory (size: 11.6 MB, free: 282.9 MB)
14/07/07 13:54:53 INFO BlockManagerMaster: Updated info of block taskresult_14
14/07/07 13:54:53 INFO BlockManagerInfo: Removed taskresult_13 on pzxnvm2018.a.b.org:50924 in memory (size: 11.6 MB, free: 294.4 MB)
14/07/07 13:54:53 INFO BlockManagerMaster: Updated info of block taskresult_13
14/07/07 13:54:54 INFO TaskSetManager: Finished TID 13 in 3043 ms on localhost (progress: 1/2)
14/07/07 13:54:54 INFO DAGScheduler: Completed ResultTask(7, 0)
14/07/07 13:54:54 INFO TaskSchedulerImpl: Removed TaskSet 7.0, whose tasks have all completed, from pool
14/07/07 13:54:54 INFO DAGScheduler: Failed to run collect at JaccardScore.scala:84
14/07/07 13:54:54 INFO TaskSchedulerImpl: Cancelling stage 7
org.apache.spark.SparkException: Job aborted due to stage failure: Exception while deserializing and fetching task: com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 13994
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1033)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1017)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1015)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1015)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:633)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:633)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:633)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1207)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
        at akka.actor.ActorCell.invoke(ActorCell.scala:456)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
        at akka.dispatch.Mailbox.run(Mailbox.scala:219)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

scala> 14/07/07 13:54:57 INFO AppClient$ClientActor: Connecting to master spark://pzxnvm2018:7077...
14/07/07 13:55:17 INFO AppClient$ClientActor: Connecting to master spark://pzxnvm2018:7077...
14/07/07 13:55:37 ERROR SparkDeploySchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
14/07/07 13:55:37 ERROR TaskSchedulerImpl: Exiting due to error from cluster scheduler: All masters are unresponsive! Giving up.
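The failure seems to start with the KryoException ("Encountered unregistered class ID: 13994") while the driver is deserializing and fetching a task result from the collect at JaccardScore.scala:84. I am not sure whether this means I need to register my result classes with Kryo explicitly. Below is a minimal sketch of what I understand Kryo registration to look like in Spark; ScoreRow and JaccardKryoRegistrator are placeholder names for illustration, not classes from my actual code:

import com.esotericsoftware.kryo.Kryo
import org.apache.spark.SparkConf
import org.apache.spark.serializer.KryoRegistrator

// Placeholder for whatever result type the Jaccard computation actually collects.
case class ScoreRow(docId: Long, score: Double)

// Hypothetical registrator: registers the result classes up front so the
// driver and executors agree on Kryo class IDs.
class JaccardKryoRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    kryo.register(classOf[ScoreRow])
    kryo.register(classOf[Array[ScoreRow]])
  }
}

val conf = new SparkConf()
  .setAppName("JaccardScore")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryo.registrator", "JaccardKryoRegistrator")

Is missing registration (or a serializer mismatch between driver and executors) a plausible cause of the "unregistered class ID" error, or does that error usually point to something else, such as a corrupted or oversized task result?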
Logs for the Master node:

14/07/07 13:51:55 INFO Master: akka.tcp://spark@localhost:45063 got disassociated, removing it.
14/07/07 13:54:38 ERROR EndpointWriter: dropping message [class akka.actor.SelectChildName] for non-local recipient [Actor[akka.tcp://sparkMaster@pzxnvm2018:7077/]] arriving at [akka.tcp://sparkMaster@pzxnvm2018:7077] inbound addresses are [akka.tcp://sparkmas...@pzxnvm2018.a.b.org:7077]
14/07/07 13:54:57 ERROR EndpointWriter: dropping message [class akka.actor.SelectChildName] for non-local recipient [Actor[akka.tcp://sparkMaster@pzxnvm2018:7077/]] arriving at [akka.tcp://sparkMaster@pzxnvm2018:7077] inbound addresses are [akka.tcp://sparkmas...@pzxnvm2018.a.b.org:7077]
14/07/07 13:55:17 ERROR EndpointWriter: dropping message [class akka.actor.SelectChildName] for non-local recipient [Actor[akka.tcp://sparkMaster@pzxnvm2018:7077/]] arriving at [akka.tcp://sparkMaster@pzxnvm2018:7077] inbound addresses are [akka.tcp://sparkmas...@pzxnvm2018.a.b.org:7077]
14/07/07 13:55:38 INFO Master: akka.tcp://spark@localhost:38739 got disassociated, removing it.
14/07/07 13:55:38 INFO Master: akka.tcp://spark@localhost:38739 got disassociated, removing it.
14/07/07 13:55:38 INFO LocalActorRef: Message [akka.remote.transport.ActorTransportAdapter$DisassociateUnderlying] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40172.16.48.41%3A49030-9#-767579404] was not delivered. [2] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
14/07/07 13:55:38 ERROR EndpointWriter: AssociationError [akka.tcp://sparkmas...@pzxnvm2018.a.b.org:7077] -> [akka.tcp://spark@localhost:38739]: Error [Association failed with [akka.tcp://spark@localhost:38739]] [akka.remote.EndpointAssociationException: Association failed with [akka.tcp://spark@localhost:38739]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: localhost/127.0.0.1:38739]
14/07/07 13:55:38 INFO Master: akka.tcp://spark@localhost:38739 got disassociated, removing it.
14/07/07 13:55:38 INFO Master: akka.tcp://spark@localhost:38739 got disassociated, removing it.
14/07/07 13:55:38 ERROR EndpointWriter: AssociationError [akka.tcp://sparkmas...@pzxnvm2018.a.b.org:7077] -> [akka.tcp://spark@localhost:38739]: Error [Association failed with [akka.tcp://spark@localhost:38739]] [akka.remote.EndpointAssociationException: Association failed with [akka.tcp://spark@localhost:38739]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: localhost/127.0.0.1:38739]
14/07/07 13:55:38 ERROR EndpointWriter: AssociationError [akka.tcp://sparkmas...@pzxnvm2018.a.b.org:7077] -> [akka.tcp://spark@localhost:38739]: Error [Association failed with [akka.tcp://spark@localhost:38739]] [akka.remote.EndpointAssociationException: Association failed with [akka.tcp://spark@localhost:38739]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: localhost/127.0.0.1:38739]
14/07/07 13:55:38 INFO Master: akka.tcp://spark@localhost:38739 got disassociated, removing it.
Master deployment logs:
14/07/07 13:44:41 INFO LocalActorRef: Message [akka.remote.transport.ActorTransportAdapter$DisassociateUnderlying] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40172.16.48.41%3A48763-33#-1703494807] was not delivered. [8] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
14/07/07 13:44:41 INFO Master: akka.tcp://spark@localhost:47058 got disassociated, removing it.
14/07/07 13:44:41 ERROR EndpointWriter: AssociationError [akka.tcp://sparkmas...@pzxnvm2018.a.b.org:7077] -> [akka.tcp://spark@localhost:47058]: Error [Association failed with [akka.tcp://spark@localhost:47058]] [akka.remote.EndpointAssociationException: Association failed with [akka.tcp://spark@localhost:47058]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: localhost/127.0.0.1:47058]
14/07/07 13:44:41 INFO Master: akka.tcp://spark@localhost:47058 got disassociated, removing it.
14/07/07 13:44:41 ERROR EndpointWriter: AssociationError [akka.tcp://sparkmas...@pzxnvm2018.a.b.org:7077] -> [akka.tcp://spark@localhost:47058]: Error [Association failed with [akka.tcp://spark@localhost:47058]] [akka.remote.EndpointAssociationException: Association failed with [akka.tcp://spark@localhost:47058]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: localhost/127.0.0.1:47058]
14/07/07 13:44:41 ERROR EndpointWriter: AssociationError [akka.tcp://sparkmas...@pzxnvm2018.a.b.org:7077] -> [akka.tcp://spark@localhost:47058]: Error [Association failed with [akka.tcp://spark@localhost:47058]] [akka.remote.EndpointAssociationException: Association failed with [akka.tcp://spark@localhost:47058]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: localhost/127.0.0.1:47058]
14/07/07 13:44:41 INFO Master: akka.tcp://spark@localhost:47058 got disassociated, removing it.
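
One other thing I notice in the master logs: the shell connects to spark://pzxnvm2018:7077, but the master's inbound Akka address is pzxnvm2018.a.b.org:7077, and the "dropping message ... for non-local recipient" lines suggest the master discards messages whose host string does not match exactly. If that mismatch matters, would pointing the driver at the exact address the master advertises be the right fix? A minimal sketch of what I have in mind (the SparkContext setup here is illustrative, not my actual JaccardScore code):

import org.apache.spark.{SparkConf, SparkContext}

// Point the driver at the exact host string the master binds to
// (the fully qualified pzxnvm2018.a.b.org rather than the short name pzxnvm2018).
val conf = new SparkConf()
  .setAppName("JaccardScore")
  .setMaster("spark://pzxnvm2018.a.b.org:7077")
val sc = new SparkContext(conf)

Alternatively, I could start the shell with --master spark://pzxnvm2018.a.b.org:7077. I would like to confirm whether the hostname mismatch or the Kryo error is the root cause here.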
                                          
