Hi, Holden,

Based on https://github.com/pypa/packaging-problems/issues/90 , the
limit has been increased to 250MB.

Just wondering: can we publish PySpark to PyPI now? Have you created the
account?

Thanks,

Xiao Li
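
For reference, here is a minimal sketch of what the publish step could look
like once the 250MB limit applies: check the built sdist against the limit,
then hand it to twine. The python/dist path, the pyspark-*.tar.gz name, and
the use of twine are assumptions for illustration only, not the actual Spark
release tooling.

    # Sketch: verify the PySpark sdist fits under the 250MB PyPI limit, then upload.
    # Assumes the sdist was already built (e.g. `python setup.py sdist` under python/)
    # and that twine is installed and configured (~/.pypirc or TWINE_* env vars).
    import subprocess
    from pathlib import Path

    PYPI_SIZE_LIMIT = 250 * 1024 * 1024  # the 250MB limit discussed above

    dist_dir = Path("python/dist")  # assumed location of the built sdist
    artifacts = sorted(dist_dir.glob("pyspark-*.tar.gz"))
    if not artifacts:
        raise SystemExit("no sdist found; build it first with `python setup.py sdist`")

    sdist = artifacts[-1]
    size = sdist.stat().st_size
    print(f"{sdist.name}: {size / 1024 / 1024:.1f} MB")

    # Refuse to upload anything PyPI would still reject.
    if size > PYPI_SIZE_LIMIT:
        raise SystemExit("sdist exceeds the 250MB PyPI limit")

    subprocess.run(["twine", "upload", str(sdist)], check=True)
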



2017-05-12 11:35 GMT-07:00 Sameer Agarwal <sam...@databricks.com>:

> Holden,
>
> Thanks again for pushing this forward! Out of curiosity, did we get
> approval from the PyPI folks?
>
> Regards,
> Sameer
>
> On Mon, May 8, 2017 at 11:44 PM, Holden Karau <hol...@pigscanfly.ca>
> wrote:
>
>> So I have a PR to add this to the release process documentation - I'm
>> waiting on the necessary approvals from the PyPI folks before I merge it,
>> in case anything changes as a result of the discussion (like uploading to
>> the legacy host or something). As for conda-forge, it's not something we
>> need to do, but I'll add a note about pinging them when we make a new
>> release so their users can keep up to date easily. The parent JIRA for
>> PyPI-related tasks is SPARK-18267 :)
>>
>>
>> On Mon, May 8, 2017 at 6:22 PM cloud0fan <cloud0...@gmail.com> wrote:
>>
>>> Hi Holden,
>>>
>>> Thanks for working on it! Do we have a JIRA ticket to track this? We
>>> should make it part of the release process for all following Spark
>>> releases, and it would be great to have a JIRA ticket recording the
>>> detailed steps so we can eventually automate them.
>>>
>>> Thanks,
>>> Wenchen
>>>
>>>
>>>
>
>
> --
> Sameer Agarwal
> Software Engineer | Databricks Inc.
> http://cs.berkeley.edu/~sameerag
>
