As explained before I am cancelling the vote for rc1 and will send a vote
for rc2 in a moment.
J.
On Mon, May 18, 2020 at 3:02 PM Jarek Potiuk wrote:
> Yep. Good point. It has no chance to work. I will remove it from release
> list.
>
> J.
>
> On Mon, May 18, 2020 at 2:40 PM Ash Berlin-Taylor wrote:
Yep. Good point. It has no chance to work. I will remove it from release
list.
J.
On Mon, May 18, 2020 at 2:40 PM Ash Berlin-Taylor wrote:
> Updating my vote to a hard -1 -- the cncf-kubernetes operator doesn't
> work as we need to special case that one to pull in more files (that or
> we _dont_ release that one as a backport which I am less fond of):
Updating my vote to a hard -1 -- the cncf-kubernetes operator doesn't
work as we need to special case that one to pull in more files (that or
we _dont_ release that one as a backport which I am less fond of):
```
In [1]: import airflow.providers.cncf.kubernetes.operators.kubernetes_pod
```
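Ash's check above can be generalized: rather than stopping at the first broken import, a release check could try every expected provider module and collect the failures. A minimal sketch (the helper name and the module list are hypothetical, not part of the release tooling):

```python
import importlib

def smoke_import(module_name):
    """Try to import a module; return (ok, error_message)."""
    try:
        importlib.import_module(module_name)
        return True, None
    except Exception as exc:  # ImportError, missing optional deps, etc.
        return False, f"{type(exc).__name__}: {exc}"

# Hypothetical usage; real runs would list the backport provider module paths.
for mod in ["json", "no_such_provider_module"]:  # stand-ins, not real providers
    ok, err = smoke_import(mod)
    print(mod, "OK" if ok else f"FAILED ({err})")
```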
Thanks Ash for all the comments and for testing/reviewing :). Glad to have
someone looking over such a big release. I would love others to take a
look as well.
I already fixed the -py2-py3 problem (universal flag in setup.cfg replaced
with python-tag=py3)
I think none of the problems are "critical".
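The -py2-py3 fix Jarek describes is a wheel-tag change in setup.cfg; a sketch of what that change plausibly looks like (the surrounding file contents are an assumption):

```ini
[bdist_wheel]
# Before: "universal = 1" produced -py2.py3-none-any wheels,
# wrongly implying Python 2 support for the backport packages.
# After: tag the wheels as Python-3-only instead:
python-tag = py3
```

With `python-tag = py3`, the built wheel is named `...-py3-none-any.whl` rather than `...-py2.py3-none-any.whl`.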
On May 18, 2020, at 11:25 am, Jarek Potiuk wrote:
>>
>>Back-ported apache-airflow-backport-providers-snowflake package for
>> Airflow 1.10.*
>>
>> I think it should say
>>
>>Back-ported apache-airflow-providers-snowflake package for Airflow
>> 1.10.*
>>
>> Why no backports?
On Mon, May 18, 2020 at 12:22 PM Ash Berlin-Taylor wrote:
> I think it would also be nice to state prominently on the project
> description on PyPi, for example on
> https://pypi.org/project/apache-airflow-backport-providers-snowflake/2020.5.20rc1/#description
>
Sure, I can update that :). That's
>
>Back-ported apache-airflow-backport-providers-snowflake package for
> Airflow 1.10.*
>
> I think it should say
>
>Back-ported apache-airflow-providers-snowflake package for Airflow
> 1.10.*
>
> Why no backports? I believe we agreed (between you, Kaxil, and myself:
https://lists.apache
I think it would also be nice to state prominently on the project
description on PyPi, for example on
https://pypi.org/project/apache-airflow-backport-providers-snowflake/2020.5.20rc1/#description
*Only Python 3.6+ is supported for this backport package.* Airflow
1.10 continues to support Python 2.7.
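The version constraint Ash wants stated on PyPi can also be expressed mechanically; a minimal sketch (the function name is hypothetical) of the 3.6+ gate the description would advertise:

```python
import sys

def supports_backport_providers(version_info=sys.version_info):
    """Backport provider packages are Python-3-only (3.6+);
    Airflow 1.10 itself still runs on Python 2.7 as well."""
    return tuple(version_info[:2]) >= (3, 6)

print(supports_backport_providers((3, 6, 0)))   # True
print(supports_backport_providers((2, 7, 18)))  # False
```

In packaging terms the same gate would normally be a `python_requires=">=3.6"` constraint, so pip on Python 2 never even offers the package.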
Other things:
The project description on PyPi says:
Back-ported apache-airflow-backport-providers-snowflake package for
Airflow 1.10.*
I think it should say
Back-ported apache-airflow-providers-snowflake package for Airflow 1.10.*
We don't publish `-src` archives for these releases, and
>
> -1 (but recoverable without cancelling the vote)
>
> There are no wheel files in the artifacts uploaded to apache SVN (which
> per the ASF rules is strictly what we vote upon/release). PyPi is just
> how we all install it.
>
> Please upload the .whl files to
>
> https://dist.apache.org/re
Nice one Jarek, looking forward to having this!
However:
-1 (but recoverable without cancelling the vote)
There are no wheel files in the artifacts uploaded to apache SVN (which
per the ASF rules is strictly what we vote upon/release). PyPi is just
how we all install it.
Please upload the .whl
Hey all,
I have cut Airflow Backport Providers 2020.5.20rc1.
This email is calling a vote on the release, which will last for 72 hours
and will therefore end on 2020.05.21 at 1:00 am CEST.
Consider this my (binding) +1.
Airflow Backport Providers 2020.5.20rc1 are available at:
https://dist
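The 72-hour window above can be sanity-checked with a quick sketch; the start time below is inferred from the stated end time, not given in the email:

```python
from datetime import datetime, timedelta

# ASF release votes stay open for at least 72 hours.
# The announced end is 2020.05.21 1:00 am CEST; the implied start
# (an assumption, derived rather than stated) is 72 hours earlier.
VOTE_END = datetime(2020, 5, 21, 1, 0)
vote_start = VOTE_END - timedelta(hours=72)
print(vote_start)  # 2020-05-18 01:00:00
```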