[ https://issues.apache.org/jira/browse/SPARK-1307?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14169831#comment-14169831 ]

Apache Spark commented on SPARK-1307:
-------------------------------------

User 'srowen' has created a pull request for this issue:
https://github.com/apache/spark/pull/2787

> don't use term 'standalone' to refer to a Spark Application
> -----------------------------------------------------------
>
>                 Key: SPARK-1307
>                 URL: https://issues.apache.org/jira/browse/SPARK-1307
>             Project: Spark
>          Issue Type: Bug
>          Components: Documentation
>    Affects Versions: 0.9.0
>            Reporter: Diana Carroll
>
> In the "Quick Start Guide" for Scala, there are three sections entitled "A 
> Standalone App in Scala", "A Standalone App in Java" and "A Standalone App in 
> Python."  
> In these sections, the word "standalone" is meant to refer to a Spark 
> application that is run outside of the Spark Shell. This nomenclature is 
> quite confusing, because the cluster management framework included in Spark 
> is called "Spark Standalone"...this overlap of terms has led at least 
> one person (me) to think that a "standalone app" was somehow related to a 
> "standalone cluster"...and that in order to run my app on a Standalone Spark 
> cluster, I had to follow the instructions for writing a Standalone app.
> Fortunately, the only place I can find this usage of "standalone" to refer to 
> an application is on the Quick Start page.   I think those three sections 
> should instead be retitled as
> Writing a Spark Application in Scala
> Writing a Spark Application in Java
> Writing a Spark Application in Python
> and rephrased to remove the use of the term "standalone".
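
For context, the sections in question walk through a self-contained application
along these lines (a minimal sketch, not the exact Quick Start listing; the file
name, "local" master, and input path below are placeholders):

/* SimpleApp.scala -- illustrative only */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object SimpleApp {
  def main(args: Array[String]) {
    // A self-contained Spark application creates its own SparkContext and runs
    // outside the Spark shell. The "local" master here is unrelated to the
    // "Spark Standalone" cluster manager that the issue says the term
    // "standalone app" gets confused with.
    val sc = new SparkContext("local", "Simple App")
    val logData = sc.textFile("README.md").cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println(s"Lines with a: $numAs, Lines with b: $numBs")
    sc.stop()
  }
}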



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
