Didn't we want to upload 2.1.1 too? What is the local version string problem?
Matei
On May 26, 2017, at 10:11 AM, Xiao Li wrote:
Hi, Holden,
That sounds good to me!
Thanks,
Xiao
2017-05-23 16:32 GMT-07:00 Holden Karau:
An account already exists, the PMC has the info for it. I think we will
need to wait for the 2.2 artifacts to do the actual PyPI upload because of
the local version string in 2.2.1, but rest assured this isn't something
I've lost track of.
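[Editorial note: for readers unfamiliar with the issue above, PyPI rejects uploads whose version string carries a PEP 440 "local" segment, i.e. anything after a "+". A minimal sketch of the check; the "+hadoop2.7" tag is a hypothetical illustration, not the actual Spark artifact name:]

```python
import re

# Sketch: detect a PEP 440 "local version" segment (the part after "+").
# PyPI rejects releases whose version has one, which is why the upload
# has to wait for artifacts with a plain public version.
def has_local_version(version: str) -> bool:
    return re.search(r"\+[A-Za-z0-9]([A-Za-z0-9._]*[A-Za-z0-9])?$", version) is not None

print(has_local_version("2.2.0"))            # False: acceptable for PyPI
print(has_local_version("2.1.1+hadoop2.7"))  # True: PyPI would reject it
```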
On Wed, May 24, 2017 at 12:11 AM Xiao Li wrote:
Hi, Holden,
Based on https://github.com/pypa/packaging-problems/issues/90, the upload
size limit has been increased to 250 MB.
Just wondering if we can publish PySpark to PyPI now? Have you created the
account?
Thanks,
Xiao Li
2017-05-12 11:35 GMT-07:00 Sameer Agarwal:
Holden,
Thanks again for pushing this forward! Out of curiosity, did we get
approval from the PyPI folks?
Regards,
Sameer
On Mon, May 8, 2017 at 11:44 PM, Holden Karau wrote:
So I have a PR to add this to the release process documentation - I'm
waiting on the necessary approvals from the PyPI folks before I merge it,
in case anything changes as a result of the discussion (like uploading to
the legacy host or something). As for conda-forge, it's not something we
need to do,
Hi Holden,
Thanks for working on it! Do we have a JIRA ticket to track this? We should
make it part of the release process for all following Spark releases, and
it would be great to have a JIRA ticket recording the detailed steps so we
can eventually automate them.
Thanks,
Wenchen