[ https://issues.apache.org/jira/browse/SPARK-8941?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14623226#comment-14623226 ]

Jesper Lundgren commented on SPARK-8941:
----------------------------------------

Maybe it is better to close this issue and open a new one for the API change 
and the documentation issues.

I'll probably review some of the issues we had with the standalone cluster 
and see whether I should create JIRA tickets for some of them.

For example, when using supervised mode in an HA cluster, there is no 
well-documented procedure to force-stop a driver and prevent it from being 
restarted (in case the driver exits with the wrong exit code). I know of the 
"kill" command:
bin/spark-class org.apache.spark.deploy.Client kill <master-url> <driver-id>
but in my experience it does not always work.
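
For reference, a minimal sketch of that invocation (the master URL and the 
driver ID below are hypothetical examples; the real submission ID comes from 
the master web UI or the spark-submit output):

# Client kill takes the master URL and the driver's submission ID.
# driver-20150709123306-0000 is a made-up placeholder.
bin/spark-class org.apache.spark.deploy.Client kill \
  spark://localhost:7077 driver-20150709123306-0000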



> Standalone cluster worker does not accept multiple masters on launch
> --------------------------------------------------------------------
>
>                 Key: SPARK-8941
>                 URL: https://issues.apache.org/jira/browse/SPARK-8941
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy, Documentation
>    Affects Versions: 1.4.0, 1.4.1
>            Reporter: Jesper Lundgren
>            Priority: Critical
>
> Before 1.4 it was possible to launch a worker node using a comma-separated 
> list of master nodes, 
> e.g.:
> sbin/start-slave.sh 1 "spark://localhost:7077,localhost:7078"
> starting org.apache.spark.deploy.worker.Worker, logging to 
> /Users/jesper/Downloads/spark-1.4.0-bin-cdh4/sbin/../logs/spark-jesper-org.apache.spark.deploy.worker.Worker-1-Jespers-MacBook-Air.local.out
> failed to launch org.apache.spark.deploy.worker.Worker:
>                              Default is conf/spark-defaults.conf.
>   15/07/09 12:33:06 INFO Utils: Shutdown hook called
> Spark 1.2 and 1.3.1 accept multiple masters in this format.
> Update: in 1.4, start-slave.sh expects only the master list (no instance 
> number argument); see the sketch below.
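>
> A minimal sketch of the change (reusing the localhost master URLs from the 
> example above; adjust to your own masters):
> # Spark 1.3.x: instance number, then the comma-separated master list
> sbin/start-slave.sh 1 "spark://localhost:7077,localhost:7078"
> # Spark 1.4.x: master list only; the instance-number argument is gone
> sbin/start-slave.sh "spark://localhost:7077,localhost:7078"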



