What is your system setup? Can you paste your spark-env.sh? It looks like
there is a problem with your configuration.
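For reference, a worker's conf/spark-env.sh on a standalone 1.x cluster usually pins the host and resource settings explicitly. A minimal sketch (the hostnames, paths, and sizes below are placeholders, not values taken from your cluster):

```shell
# conf/spark-env.sh on each worker (placeholder values -- adjust to your cluster)
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk     # JVM used by the Spark daemons
export SPARK_MASTER_IP=saturn00                  # host the standalone master binds to
export SPARK_LOCAL_IP=$(hostname)                # address this node advertises to others
export SPARK_WORKER_CORES=4                      # cores offered by this worker
export SPARK_WORKER_MEMORY=8g                    # memory offered by this worker
```

If the driver and executors resolve these hostnames differently (or run different Spark versions), the executor can fail to look up the driver's actors, which would match the error below.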

Thanks and best regards

On Fri, Sep 12, 2014 at 6:31 PM, 남윤민 <rony...@dgist.ac.kr> wrote:

> I got this error from the executor's stderr:
>
> Using Spark's default log4j profile: 
> org/apache/spark/log4j-defaults.properties
> 14/09/12 21:53:36 INFO CoarseGrainedExecutorBackend: Registered signal 
> handlers for [TERM, HUP, INT]
> 14/09/12 21:53:36 WARN NativeCodeLoader: Unable to load native-hadoop library 
> for your platform... using builtin-java classes where applicable
> 14/09/12 21:53:36 INFO SecurityManager: Changing view acls to: root
> 14/09/12 21:53:36 INFO SecurityManager: Changing modify acls to: root
> 14/09/12 21:53:36 INFO SecurityManager: SecurityManager: authentication 
> disabled; ui acls disabled; users with view permissions: Set(root); users 
> with modify permissions: Set(root)
> 14/09/12 21:53:36 INFO Slf4jLogger: Slf4jLogger started
> 14/09/12 21:53:36 INFO Remoting: Starting remoting
> 14/09/12 21:53:37 INFO Remoting: Remoting started; listening on addresses 
> :[akka.tcp://driverPropsFetcher@saturn09:35376]
> 14/09/12 21:53:37 INFO Remoting: Remoting now listens on addresses: 
> [akka.tcp://driverPropsFetcher@saturn09:35376]
> 14/09/12 21:53:37 INFO Utils: Successfully started service 
> 'driverPropsFetcher' on port 35376.
> 14/09/12 21:53:37 INFO SecurityManager: Changing view acls to: root
> 14/09/12 21:53:37 INFO SecurityManager: Changing modify acls to: root
> 14/09/12 21:53:37 INFO SecurityManager: SecurityManager: authentication 
> disabled; ui acls disabled; users with view permissions: Set(root); users 
> with modify permissions: Set(root)
> 14/09/12 21:53:37 INFO RemoteActorRefProvider$RemotingTerminator: Shutting 
> down remote daemon.
> 14/09/12 21:53:37 INFO RemoteActorRefProvider$RemotingTerminator: Remote 
> daemon shut down; proceeding with flushing remote transports.
> 14/09/12 21:53:37 INFO Slf4jLogger: Slf4jLogger started
> 14/09/12 21:53:37 INFO Remoting: Starting remoting
> 14/09/12 21:53:37 INFO Remoting: Remoting shut down
> 14/09/12 21:53:37 INFO RemoteActorRefProvider$RemotingTerminator: Remoting 
> shut down.
> 14/09/12 21:53:37 INFO Remoting: Remoting started; listening on addresses 
> :[akka.tcp://sparkExecutor@saturn09:47076]
> 14/09/12 21:53:37 INFO Remoting: Remoting now listens on addresses: 
> [akka.tcp://sparkExecutor@saturn09:47076]
> 14/09/12 21:53:37 INFO Utils: Successfully started service 'sparkExecutor' on 
> port 47076.
> 14/09/12 21:53:37 INFO CoarseGrainedExecutorBackend: Connecting to driver: 
> akka.tcp://sparkDriver@saturn00:49464/user/CoarseGrainedScheduler
> 14/09/12 21:53:37 INFO WorkerWatcher: Connecting to worker 
> akka.tcp://sparkWorker@saturn09:43584/user/Worker
> 14/09/12 21:53:37 INFO WorkerWatcher: Successfully connected to 
> akka.tcp://sparkWorker@saturn09:43584/user/Worker
> 14/09/12 21:53:37 INFO CoarseGrainedExecutorBackend: Successfully registered 
> with driver
> 14/09/12 21:53:37 INFO SecurityManager: Changing view acls to: root
> 14/09/12 21:53:37 INFO SecurityManager: Changing modify acls to: root
> 14/09/12 21:53:37 INFO SecurityManager: SecurityManager: authentication 
> disabled; ui acls disabled; users with view permissions: Set(root); users 
> with modify permissions: Set(root)
> 14/09/12 21:53:37 INFO Slf4jLogger: Slf4jLogger started
> 14/09/12 21:53:37 INFO Remoting: Starting remoting
> 14/09/12 21:53:37 INFO Remoting: Remoting started; listening on addresses 
> :[akka.tcp://sparkExecutor@saturn09:34812]
> 14/09/12 21:53:37 INFO Remoting: Remoting now listens on addresses: 
> [akka.tcp://sparkExecutor@saturn09:34812]
> 14/09/12 21:53:37 INFO Utils: Successfully started service 'sparkExecutor' on 
> port 34812.
> 14/09/12 21:53:37 INFO AkkaUtils: Connecting to MapOutputTracker: 
> akka.tcp://sparkDriver@saturn00:49464/user/MapOutputTracker
> 14/09/12 21:53:37 INFO CoarseGrainedExecutorBackend: Connecting to driver: 
> akka.tcp://sparkDriver@saturn00:49464/user/CoarseGrainedScheduler
> 14/09/12 21:53:37 ERROR OneForOneStrategy: Actor not found for: 
> ActorSelection[Actor[akka.tcp://sparkDriver@saturn00:49464/]/user/MapOutputTracker]
> akka.actor.ActorNotFound: Actor not found for: 
> ActorSelection[Actor[akka.tcp://sparkDriver@saturn00:49464/]/user/MapOutputTracker]
>       at 
> akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:66)
>       at 
> akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:64)
>       at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
>       at 
> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
>       at 
> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
>       at 
> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
>       at 
> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
>       at 
> scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
>       at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
>       at 
> akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
>       at 
> akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:110)
>       at 
> akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
>       at 
> scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
>       at 
> scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
>       at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:269)
>       at akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:512)
>       at akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:545)
>       at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:535)
>       at 
> akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:91)
>       at akka.actor.ActorRef.tell(ActorRef.scala:125)
>       at akka.dispatch.Mailboxes$$anon$1$$anon$2.enqueue(Mailboxes.scala:44)
>       at akka.dispatch.QueueBasedMessageQueue$class.cleanUp(Mailbox.scala:438)
>       at 
> akka.dispatch.UnboundedDequeBasedMailbox$MessageQueue.cleanUp(Mailbox.scala:650)
>       at akka.dispatch.Mailbox.cleanUp(Mailbox.scala:309)
>       at 
> akka.dispatch.MessageDispatcher.unregister(AbstractDispatcher.scala:204)
>       at akka.dispatch.MessageDispatcher.detach(AbstractDispatcher.scala:140)
>       at 
> akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:203)
>       at 
> akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:163)
>       at akka.actor.ActorCell.terminate(ActorCell.scala:338)
>       at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:431)
>       at akka.actor.ActorCell.systemInvoke(ActorCell.scala:447)
>       at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:262)
>       at akka.dispatch.Mailbox.run(Mailbox.scala:218)
>       at 
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
>       at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>       at 
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>       at 
> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>       at 
> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> 14/09/12 21:53:37 ERROR CoarseGrainedExecutorBackend: Driver Disassociated 
> [akka.tcp://sparkExecutor@saturn09:47076] -> 
> [akka.tcp://sparkDriver@saturn00:49464] disassociated! Shutting down.
>
> What is the reason for the "Actor not found" error?
>
> // *Yoonmin Nam*
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>