They should be fixed - in the sense that the docs now recommend using
spark-submit and thus include entirely different invocations.
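
For example, the 1.0 docs now show a submission along these lines (the
class name, master URL, and jar path below are placeholders):

    ./bin/spark-submit \
      --class org.example.MyApp \
      --master spark://master:7077 \
      --deploy-mode cluster \
      /path/to/my-app.jar \
      [application-arguments]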


On Fri, May 30, 2014 at 12:46 AM, Reynold Xin <r...@databricks.com> wrote:

> Can you take a look at the latest Spark 1.0 docs and see if they are fixed?
>
> https://github.com/apache/spark/tree/master/docs
>
> Thanks.
>
>
> On Thu, May 29, 2014 at 5:29 AM, Lizhengbing (bing, BIPA) <
> zhengbing...@huawei.com> wrote:
>
> > The instruction in question is at
> > http://spark.apache.org/docs/0.9.0/spark-standalone.html#launching-applications-inside-the-cluster
> > or
> > http://spark.apache.org/docs/0.9.1/spark-standalone.html#launching-applications-inside-the-cluster
> >
> > The original instruction is:
> > "./bin/spark-class org.apache.spark.deploy.Client launch
> >    [client-options] \
> >    <cluster-url> <application-jar-url> <main-class> \
> >    [application-options] "
> >
> > If I follow this instruction, my program does not launch properly on a
> > Spark standalone cluster.
> >
> > Based on the source code, this instruction should be changed to
> > "./bin/spark-class org.apache.spark.deploy.Client [client-options] launch \
> >    <cluster-url> <application-jar-url> <main-class> \
> >    [application-options] "
> >
> > That is to say, [client-options] must come before the launch command.
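> >
> > For example, with the --memory/--cores client options from the usage
> > text (the cluster URL, jar location, class name, and values below are
> > all illustrative):
> >
> > "./bin/spark-class org.apache.spark.deploy.Client \
> >    --memory 1024 --cores 2 launch \
> >    spark://master:7077 hdfs://namenode/path/to/app.jar org.example.MyApp \
> >    arg1 arg2"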
> >
>
