[ https://issues.apache.org/jira/browse/SPARK-6443?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Andrew Or updated SPARK-6443:
-----------------------------
    Summary: Support HA in standalone cluster mode when HA is enabled  (was: Could not submit app in standalone cluster mode when HA is enabled)

> Support HA in standalone cluster mode when HA is enabled
> ---------------------------------------------------------
>
>                 Key: SPARK-6443
>                 URL: https://issues.apache.org/jira/browse/SPARK-6443
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.0.0
>            Reporter: Tao Wang
>            Priority: Critical
>
> After digging into the code, I found that a user cannot submit an app in
> standalone cluster mode when HA is enabled, although client mode works.
> I haven't tried it yet, but I will verify this and file a PR to resolve it
> if the problem exists.
>
> 3/23 update:
> I started an HA cluster with ZooKeeper and tried to submit the SparkPi
> example with this command:
> ./spark-submit --class org.apache.spark.examples.SparkPi --master
> spark://doggie153:7077,doggie159:7077 --deploy-mode cluster
> ../lib/spark-examples-1.2.0-hadoop2.4.0.jar
> It failed with the following error:
> Spark assembly has been built with Hive, including Datanucleus jars on
> classpath
> 15/03/23 15:24:45 ERROR actor.OneForOneStrategy: Invalid master URL:
> spark://doggie153:7077,doggie159:7077
> akka.actor.ActorInitializationException: exception during creation
>         at akka.actor.ActorInitializationException$.apply(Actor.scala:164)
>         at akka.actor.ActorCell.create(ActorCell.scala:596)
>         at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:456)
>         at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
>         at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
>         at akka.dispatch.Mailbox.run(Mailbox.scala:219)
>         at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
>         at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>         at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>         at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>         at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> Caused by: org.apache.spark.SparkException: Invalid master URL:
> spark://doggie153:7077,doggie159:7077
>         at org.apache.spark.deploy.master.Master$.toAkkaUrl(Master.scala:830)
>         at org.apache.spark.deploy.ClientActor.preStart(Client.scala:42)
>         at akka.actor.Actor$class.aroundPreStart(Actor.scala:470)
>         at org.apache.spark.deploy.ClientActor.aroundPreStart(Client.scala:35)
>         at akka.actor.ActorCell.create(ActorCell.scala:580)
>         ... 9 more
> In client mode, however, the same submission ended with the correct result,
> so my guess is right. I will fix it in the related PR.
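
For context: the stack trace shows org.apache.spark.deploy.master.Master$.toAkkaUrl rejecting the entire comma-separated HA list (spark://doggie153:7077,doggie159:7077) as a single master URL. Below is a minimal Scala sketch of the kind of client-side handling the report implies is missing; it only illustrates splitting the HA list into per-master URLs that the existing single-URL parsing could then accept. The object and value names are invented for illustration and are not taken from the actual fix.

object MasterUrlSplitSketch {
  def main(args: Array[String]): Unit = {
    // The --master value from the failing spark-submit invocation above.
    val masters = "spark://doggie153:7077,doggie159:7077"

    // Split the comma-separated HA list into individual spark://host:port
    // URLs; each element can then be parsed on its own, which is all the
    // single-master code path (Master.toAkkaUrl in the stack trace) expects.
    val masterUrls: Seq[String] =
      if (masters.startsWith("spark://")) {
        masters.stripPrefix("spark://").split(",").map("spark://" + _).toSeq
      } else {
        throw new IllegalArgumentException(s"Invalid master URL: $masters")
      }

    // Prints:
    //   spark://doggie153:7077
    //   spark://doggie159:7077
    masterUrls.foreach(println)
  }
}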