[jira] [Commented] (SPARK-22406) pyspark version tag is wrong on PyPi

2018-01-05 Thread holdenk (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22406?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16314233#comment-16314233 ]

holdenk commented on SPARK-22406:
---------------------------------

Yes. I'll close this.

> pyspark version tag is wrong on PyPi
> ------------------------------------
>
> Key: SPARK-22406
> URL: https://issues.apache.org/jira/browse/SPARK-22406
> Project: Spark
>  Issue Type: Bug
>  Components: PySpark
>Affects Versions: 2.2.0
>Reporter: Kerrick Staley
>Assignee: holdenk
>Priority: Minor
>
> On pypi.python.org, the pyspark package is tagged with version 
> {{2.2.0.post0}}: https://pypi.python.org/pypi/pyspark/2.2.0
> However, when you install the package, it has version {{2.2.0}}.
> This has really annoying consequences: if you try {{pip install 
> pyspark==2.2.0}}, it won't work. Instead you have to do {{pip install 
> pyspark==2.2.0.post0}}. Then, if you later run the same command ({{pip 
> install pyspark==2.2.0.post0}}), it won't recognize the existing pyspark 
> installation (because it has version {{2.2.0}}) and instead will reinstall 
> it, which is very slow because pyspark is a large package.
> This can happen if you add a new package to a {{requirements.txt}} file; you 
> end up waiting a lot longer than necessary because every time you run {{pip 
> install -r requirements.txt}} it reinstalls pyspark.
> Can you please change the package on PyPi to have the version {{2.2.0}}?
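The behavior described above follows from PEP 440 exact-version matching: a post-release such as {{2.2.0.post0}} is a distinct version that does not satisfy {{==2.2.0}}, and an installed distribution reporting {{2.2.0}} does not satisfy {{==2.2.0.post0}}, so pip reinstalls. The sketch below is a simplified illustration of that comparison (it is not pip's actual implementation, and the helper names are invented for this example):

```python
def parse_release_and_post(version: str):
    """Split a version like '2.2.0.post0' into (release tuple, post number).

    Returns post number None when there is no .postN segment, so
    '2.2.0' parses to ((2, 2, 0), None) and '2.2.0.post0' to ((2, 2, 0), 0).
    """
    head, sep, post = version.partition(".post")
    release = tuple(int(part) for part in head.split("."))
    return release, (int(post) if sep else None)

def satisfies_exact(installed: str, requested: str) -> bool:
    """Simplified PEP 440 '==' check: release and post segment must both match."""
    return parse_release_and_post(installed) == parse_release_and_post(requested)

# The installed metadata says 2.2.0, but requirements.txt pins 2.2.0.post0,
# so the exact match fails and pip re-downloads the large package every run.
print(satisfies_exact("2.2.0", "2.2.0.post0"))        # False -> reinstall
print(satisfies_exact("2.2.0.post0", "2.2.0.post0"))  # True  -> no-op
```

This is why aligning the uploaded version tag with the version recorded in the package metadata (both {{2.2.0}}) makes repeated {{pip install -r requirements.txt}} runs a no-op.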



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-22406) pyspark version tag is wrong on PyPi

2018-01-05 Thread Sean Owen (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22406?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16314106#comment-16314106 ]

Sean Owen commented on SPARK-22406:
-----------------------------------

[~holden.ka...@gmail.com] [~felixcheung] is this bit actually done now?



[jira] [Commented] (SPARK-22406) pyspark version tag is wrong on PyPi

2017-11-11 Thread Holden Karau (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22406?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16248405#comment-16248405 ]

Holden Karau commented on SPARK-22406:
--------------------------------------

Yes, although this should be fixed in the documented upload process, it
just has to be run at the end of the release to be verified closed.






[jira] [Commented] (SPARK-22406) pyspark version tag is wrong on PyPi

2017-11-10 Thread Felix Cheung (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22406?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16248364#comment-16248364 ]

Felix Cheung commented on SPARK-22406:
--------------------------------------

Is this still being targeted for 2.2.1?




[jira] [Commented] (SPARK-22406) pyspark version tag is wrong on PyPi

2017-11-05 Thread holdenk (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22406?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16239432#comment-16239432 ]

holdenk commented on SPARK-22406:
---------------------------------

Due to restrictions on PyPI, no. We can try and fix this in 2.2.1 however.




[jira] [Commented] (SPARK-22406) pyspark version tag is wrong on PyPi

2017-11-01 Thread Liang-Chi Hsieh (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22406?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16233700#comment-16233700 ]

Liang-Chi Hsieh commented on SPARK-22406:
-----------------------------------------

cc [~holdenk]
