[ 
https://issues.apache.org/jira/browse/SPARK-8395?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andrew Or closed SPARK-8395.
----------------------------
          Resolution: Fixed
       Fix Version/s: 1.5.0
                      1.4.1
            Assignee: Sean Owen
    Target Version/s: 1.4.1, 1.5.0

> spark-submit documentation is incorrect
> ---------------------------------------
>
>                 Key: SPARK-8395
>                 URL: https://issues.apache.org/jira/browse/SPARK-8395
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation
>    Affects Versions: 1.4.0
>            Reporter: Dev Lakhani
>            Assignee: Sean Owen
>            Priority: Minor
>             Fix For: 1.4.1, 1.5.0
>
>
> Using a fresh checkout of 1.4.0-bin-hadoop2.6
> if you run:
> ./start-slave.sh 1 spark://localhost:7077
> you get:
> failed to launch org.apache.spark.deploy.worker.Worker:
>                              Default is conf/spark-defaults.conf.
>   15/06/16 13:11:08 INFO Utils: Shutdown hook called
> it seems the worker number is not being accepted as described here:
> https://spark.apache.org/docs/latest/spark-standalone.html
> The documentation says:
> ./sbin/start-slave.sh <worker#> <master-spark-URL>
> but the start-slave.sh script states:
> usage="Usage: start-slave.sh <spark-master-URL> where <spark-master-URL> is 
> like spark://localhost:7077"
> I have checked for similar issues using:
> https://issues.apache.org/jira/browse/SPARK-6552?jql=text%20~%20%22start-slave%22
> and found nothing similar, so I am raising this as an issue.
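> For reference, a minimal sketch of the invocation that matches the script's own usage string (paths assume the root of a Spark 1.4.0 binary distribution; the master URL is an example, not taken from the issue):
>
> ```shell
> # Sketch, assuming Spark 1.4.x: per the script's usage string quoted above,
> # start-slave.sh takes only the master URL -- no leading worker number.
> ./sbin/start-slave.sh spark://localhost:7077
> ```
>
> The fix tracked here was to bring the standalone-mode documentation in line with this single-argument form.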



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
