Re: Actor not found

2015-04-18 Thread Zhihang Fan
Hi, Shixiong:
 Actually, I don't know much about this exception. I submitted a job that
would read about 2.5 TB of data, and it threw this exception. I also tried
resubmitting some jobs that had run successfully before this submission, and
they failed with the same exception.
Hope this helps with the troubleshooting.
PS: Spark 1.3.0, Hadoop version: 2.3.0-cdh5.1.0.

All my best,
Zhihang Fan

2015-04-17 16:59 GMT+08:00 Shixiong Zhu zsxw...@gmail.com:

 Forgot this one: I cannot find any issue with creating
 OutputCommitCoordinator. The order of creating OutputCommitCoordinator looks
 right.

 Best Regards,
 Shixiong(Ryan) Zhu

 2015-04-17 16:57 GMT+08:00 Shixiong Zhu zsxw...@gmail.com:

 I just checked the code for creating OutputCommitCoordinator. Could
 you reproduce this issue? If so, could you provide details about how to
 reproduce it?

 Best Regards,
 Shixiong(Ryan) Zhu

 2015-04-16 13:27 GMT+08:00 Canoe canoe...@gmail.com:

 Exception in thread "main" akka.actor.ActorNotFound: Actor not found for:
 ActorSelection[Anchor(akka.tcp://sparkdri...@dmslave13.et2.tbsite.net:5908/), Path(/user/OutputCommitCoordinator)]
     at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65)
     at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63)
     at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
     at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
     at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
     at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
     at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
     at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
     at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
     at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
     at akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:110)
     at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
     at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
     at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
     at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:267)
     at akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:508)
     at akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:541)
     at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:531)
     at akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:87)
     at akka.remote.EndpointManager$$anonfun$1.applyOrElse(Remoting.scala:575)
     at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
     at akka.remote.EndpointManager.aroundReceive(Remoting.scala:395)
     at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
     at akka.actor.ActorCell.invoke(ActorCell.scala:487)
     at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
     at akka.dispatch.Mailbox.run(Mailbox.scala:220)
     at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
     at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
     at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
     at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
     at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)


 I met the same problem when running Spark on YARN. Is this a bug, or something else?




 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/Actor-not-found-tp22265p22508.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org






-- 
谁谓河广,一苇航之 ("Who says the river is wide? A single reed can cross it.")


Re: Can Spark 1.0.2 run on CDH-4.3.0 with yarn? And Will Spark 1.2.0 support CDH5.1.2 with yarn?

2014-12-18 Thread Zhihang Fan
Hi, Sean
   Thank you for your reply. I will try Spark 1.1 and 1.2 on CDH 5.x.
 :)


2014-12-18 17:38 GMT+08:00 Sean Owen so...@cloudera.com:

 The question is really: will Spark 1.1 work with a particular version
 of YARN? Many, but not all, versions of YARN are supported. The
 stable versions are 2.2.x and later. Before that, support is patchier, and
 in fact has been removed in Spark 1.3.

 The yarn profile supports YARN stable, which is roughly 2.2.x and
 onwards. The yarn-alpha profile should work for YARN around 0.23.x.
 2.0.x and 2.1.x were a sort of beta period; I recall that
 yarn-alpha works with some of those releases, but not all, and there is no
 yarn-beta profile.

 I believe early CDH 4.x has basically YARN beta. Later 4.x has
 stable. I think I'd try the yarn-alpha profile and see if it compiles.
 But the version of YARN in that release may well be among those that
 fall in the gap between alpha and stable support.

 Thankfully things got a lot more stable past Hadoop / YARN 2.2 or so,
 so it far more often just works without version issues. And CDH 5 is based
 on Hadoop 2.3 and then 2.5, so you should be much more able to build
 whatever combination of versions you want.

 CDH 5.1.x ships Spark 1.0.x. There should be no problem using 1.1.x,
 1.2.x, etc. with it; you just need to make and support your own
 binaries. 5.2.x has 1.1.x; 5.3.x will have 1.2.x.

 On Thu, Dec 18, 2014 at 9:18 AM, Canoe canoe...@gmail.com wrote:
  I could not compile the Spark 1.1.0 source code on CDH 4.3.0 with YARN
 successfully.
  Does it support CDH 4.3.0 with YARN?
  And will Spark 1.2.0 support CDH 5.1.2?
 
 
 
  --
  View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/Can-Spark-1-0-2-run-on-CDH-4-3-0-with-yarn-And-Will-Spark-1-2-0-support-CDH5-1-2-with-yarn-tp20760.html
  Sent from the Apache Spark User List mailing list archive at Nabble.com.
 
 



-- 
谁谓河广,一苇航之 ("Who says the river is wide? A single reed can cross it.")