[ 
https://issues.apache.org/jira/browse/SPARK-26807?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen updated SPARK-26807:
------------------------------
    Priority: Trivial  (was: Minor)

I'll just fix it, to expedite.

> Confusing documentation regarding installation from PyPi
> --------------------------------------------------------
>
>                 Key: SPARK-26807
>                 URL: https://issues.apache.org/jira/browse/SPARK-26807
>             Project: Spark
>          Issue Type: Documentation
>          Components: Documentation
>    Affects Versions: 2.4.0
>            Reporter: Emmanuel Arias
>            Priority: Trivial
>
> Hello!
> I am new to Spark. Reading the documentation, I find the Downloading 
> section a little confusing.
> [https://spark.apache.org/docs/latest/#downloading|https://spark.apache.org/docs/latest/#downloading]
>  says: "Scala and Java users can include Spark in their projects using its 
> Maven coordinates and in the future Python users can also install Spark from 
> PyPI." I read this as saying that Spark is not available on PyPI yet. But 
> [https://spark.apache.org/downloads.html] says: 
> "[PySpark|https://pypi.python.org/pypi/pyspark] is now available in pypi. To 
> install just run {{pip install pyspark}}." These two statements contradict 
> each other.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
