Thanks for the log. It's really helpful. I created a JIRA to explain why this happens: https://issues.apache.org/jira/browse/SPARK-6640
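For anyone skimming the archived thread: the stack trace below fails inside `Executor.startDriverHeartbeater` → `AkkaUtils.makeDriverRef` → `ActorSelection.resolveOne`, i.e. the executor tries to resolve the driver-side HeartbeatReceiver actor before the driver has registered it. A minimal lookup-before-registration sketch of that race pattern (plain Python with invented names as a stand-in — not Spark/Akka code):

```python
import threading
import time

registry = {}  # stand-in for the actor system's name registry


def driver_setup():
    """Driver registers the receiver 'late', after a short delay."""
    time.sleep(0.01)
    registry["HeartbeatReceiver"] = object()


def executor_lookup():
    # Without waiting, a single lookup can run before registration
    # and fail with "actor not found". A bounded retry loop is a
    # crude stand-in for resolveOne's timeout.
    for _ in range(1000):
        if "HeartbeatReceiver" in registry:
            return registry["HeartbeatReceiver"]
        time.sleep(0.001)
    raise LookupError("Actor not found")


t = threading.Thread(target=driver_setup)
t.start()
ref = executor_lookup()  # succeeds once the driver has registered
t.join()
```

If the lookup had no wait at all (check once, then raise), it would intermittently fail exactly the way the log below does.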
However, does this error always happen in your environment?

Best Regards,
Shixiong Zhu

2015-03-31 22:36 GMT+08:00 sparkdi <shopaddr1...@dubna.us>:
> This is the whole output from the shell:
>
> ~/spark-1.3.0-bin-hadoop2.4$ sudo bin/spark-shell
> Spark assembly has been built with Hive, including Datanucleus jars on classpath
> log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> 15/03/30 19:00:40 INFO SecurityManager: Changing view acls to: root
> 15/03/30 19:00:40 INFO SecurityManager: Changing modify acls to: root
> 15/03/30 19:00:40 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
> 15/03/30 19:00:40 INFO HttpServer: Starting HTTP Server
> 15/03/30 19:00:40 INFO Server: jetty-8.y.z-SNAPSHOT
> 15/03/30 19:00:40 INFO AbstractConnector: Started SocketConnector@0.0.0.0:47797
> 15/03/30 19:00:40 INFO Utils: Successfully started service 'HTTP class server' on port 47797.
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/ '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 1.3.0
>       /_/
>
> Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_75)
> Type in expressions to have them evaluated.
> Type :help for more information.
> 15/03/30 19:00:42 INFO SparkContext: Running Spark version 1.3.0
> 15/03/30 19:00:42 INFO SecurityManager: Changing view acls to: root
> 15/03/30 19:00:42 INFO SecurityManager: Changing modify acls to: root
> 15/03/30 19:00:42 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
> 15/03/30 19:00:42 INFO Slf4jLogger: Slf4jLogger started
> 15/03/30 19:00:42 INFO Remoting: Starting remoting
> 15/03/30 19:00:43 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@vm:52574]
> 15/03/30 19:00:43 INFO Utils: Successfully started service 'sparkDriver' on port 52574.
> 15/03/30 19:00:43 INFO SparkEnv: Registering MapOutputTracker
> 15/03/30 19:00:43 INFO SparkEnv: Registering BlockManagerMaster
> 15/03/30 19:00:43 INFO DiskBlockManager: Created local directory at /tmp/spark-f71a8d86-6e49-4dfe-bb98-8e8581015acc/blockmgr-57532f5a-38db-4ba3-86d8-edef84f592e5
> 15/03/30 19:00:43 INFO MemoryStore: MemoryStore started with capacity 265.4 MB
> 15/03/30 19:00:43 INFO HttpFileServer: HTTP File server directory is /tmp/spark-95e0a143-0de3-4c96-861c-968c9fae2746/httpd-cb029cd6-4943-479d-9b56-e7397489d9ea
> 15/03/30 19:00:43 INFO HttpServer: Starting HTTP Server
> 15/03/30 19:00:43 INFO Server: jetty-8.y.z-SNAPSHOT
> 15/03/30 19:00:43 INFO AbstractConnector: Started SocketConnector@0.0.0.0:48500
> 15/03/30 19:00:43 INFO Utils: Successfully started service 'HTTP file server' on port 48500.
> 15/03/30 19:00:43 INFO SparkEnv: Registering OutputCommitCoordinator
> 15/03/30 19:00:43 INFO Server: jetty-8.y.z-SNAPSHOT
> 15/03/30 19:00:43 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
> 15/03/30 19:00:43 INFO Utils: Successfully started service 'SparkUI' on port 4040.
> 15/03/30 19:00:43 INFO SparkUI: Started SparkUI at http://vm:4040
> 15/03/30 19:00:43 INFO Executor: Starting executor ID <driver> on host localhost
> 15/03/30 19:00:43 INFO Executor: Using REPL class URI: http://10.11.204.80:47797
> 15/03/30 19:00:43 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@vm:52574/user/HeartbeatReceiver
> 15/03/30 19:00:43 ERROR OneForOneStrategy: Actor not found for: ActorSelection[Anchor(akka://sparkDriver/deadLetters), Path(/)]
> akka.actor.ActorInitializationException: exception during creation
>         at akka.actor.ActorInitializationException$.apply(Actor.scala:164)
>         at akka.actor.ActorCell.create(ActorCell.scala:596)
>         at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:456)
>         at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
>         at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
>         at akka.dispatch.Mailbox.run(Mailbox.scala:219)
>         at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
>         at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>         at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>         at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>         at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> Caused by: akka.actor.ActorNotFound: Actor not found for: ActorSelection[Anchor(akka://sparkDriver/deadLetters), Path(/)]
>         at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65)
>         at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63)
>         at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
>         at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
>         at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
>         at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
>         at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
>         at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
>         at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
>         at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
>         at akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:110)
>         at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
>         at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
>         at scala.concurrent.impl.Promise$DefaultPromise.scala$concurrent$impl$Promise$DefaultPromise$$dispatchOrAddCallback(Promise.scala:280)
>         at scala.concurrent.impl.Promise$DefaultPromise.onComplete(Promise.scala:270)
>         at akka.actor.ActorSelection.resolveOne(ActorSelection.scala:63)
>         at akka.actor.ActorSelection.resolveOne(ActorSelection.scala:80)
>         at org.apache.spark.util.AkkaUtils$.makeDriverRef(AkkaUtils.scala:221)
>         at org.apache.spark.executor.Executor.startDriverHeartbeater(Executor.scala:393)
>         at org.apache.spark.executor.Executor.<init>(Executor.scala:119)
>         at org.apache.spark.scheduler.local.LocalActor.<init>(LocalBackend.scala:58)
>         at org.apache.spark.scheduler.local.LocalBackend$$anonfun$start$1.apply(LocalBackend.scala:107)
>         at org.apache.spark.scheduler.local.LocalBackend$$anonfun$start$1.apply(LocalBackend.scala:107)
>         at akka.actor.TypedCreatorFunctionConsumer.produce(Props.scala:343)
>         at akka.actor.Props.newActor(Props.scala:252)
>         at akka.actor.ActorCell.newActor(ActorCell.scala:552)
>         at akka.actor.ActorCell.create(ActorCell.scala:578)
>         ... 9 more
> 15/03/30 19:00:43 INFO NettyBlockTransferService: Server created on 58205
> 15/03/30 19:00:43 INFO BlockManagerMaster: Trying to register BlockManager
> 15/03/30 19:00:43 INFO BlockManagerMasterActor: Registering block manager localhost:58205 with 265.4 MB RAM, BlockManagerId(<driver>, localhost, 58205)
> 15/03/30 19:00:43 INFO BlockManagerMaster: Registered BlockManager
> 15/03/30 19:00:43 INFO SparkILoop: Created spark context..
> Spark context available as sc.
> 15/03/30 19:00:43 INFO SparkILoop: Created sql context (with Hive support)..
> SQL context available as sqlContext.
>
>
> --
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Actor-not-found-tp22265p22324.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>