Re: [VOTE] Airflow Providers prepared on July 12, 2023

2023-07-12 Thread Ephraim Anierobi
+1 (binding) Checked expected files, licenses, signatures & SHA512 
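For anyone repeating the checksum part of that check locally, here is a minimal sketch, assuming the `.sha512` sidecar files in the dist area use the usual `<hex digest>  <filename>` layout (file names below are illustrative, not actual release artifacts):

```python
import hashlib
import pathlib


def sha512_matches(artifact: pathlib.Path, checksum_file: pathlib.Path) -> bool:
    """Compare an artifact's SHA512 digest against its .sha512 sidecar file."""
    # Sidecar assumed to contain "<hex digest>  <filename>"
    expected = checksum_file.read_text().split()[0].lower()
    actual = hashlib.sha512(artifact.read_bytes()).hexdigest()
    return actual == expected
```

Signature verification is done analogously with `gpg --verify` against the `.asc` files, after importing the KEYS file linked in the vote email.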

On 2023/07/12 19:36:33 Elad Kalif wrote:
> Hey all,
> 
> I have just cut the new wave of Airflow Providers packages. This email is
> calling a vote on the release [...]

-
To unsubscribe, e-mail: dev-unsubscr...@airflow.apache.org
For additional commands, e-mail: dev-h...@airflow.apache.org



[VOTE] Airflow Providers prepared on July 12, 2023

2023-07-12 Thread Elad Kalif
Hey all,


I have just cut the new wave of Airflow Providers packages. This email is
calling a vote on the release, which will last for 72 hours - that is,
until July 15, 2023 07:35 PM UTC - and until at least 3 binding +1 votes
have been received.
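As a quick sanity check on the deadline, it follows from adding 72 hours to the send time (approximated here from this thread's timestamp):

```python
from datetime import datetime, timedelta, timezone

# Vote opened around 2023-07-12 19:35 UTC (per this thread's timestamp)
start = datetime(2023, 7, 12, 19, 35, tzinfo=timezone.utc)
deadline = start + timedelta(hours=72)
print(deadline.strftime("%B %d, %Y %I:%M %p UTC"))  # July 15, 2023 07:35 PM UTC
```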



Consider this my (binding) +1.


Airflow Providers are available at:

https://dist.apache.org/repos/dist/dev/airflow/providers/


*apache-airflow-providers--*.tar.gz* are the source Python "sdist"
release - they are also the official "sources" for the provider packages.


*apache_airflow_providers_-*.whl are the binary Python "wheel" release.


The test procedure for PMC members who would like to test the RC candidates
is described in

https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-the-release-by-pmc-members


and for Contributors:


https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-by-contributors



Public keys are available at:

https://dist.apache.org/repos/dist/release/airflow/KEYS


Please vote accordingly:


[ ] +1 approve

[ ] +0 no opinion

[ ] -1 disapprove with the reason



Only votes from PMC members are binding, but members of the community are
encouraged to test the release and vote with "(non-binding)".


Please note that the version number excludes the 'rcX' string.
This will allow us to rename the artifact without modifying
the artifact checksums when we actually release.
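The reason the rename is safe: a SHA512 checksum covers only the file contents, not the file name. A quick local demonstration (file names here are made up):

```shell
# Create a dummy "rc" artifact and record its checksum
printf 'example artifact contents' > pkg-1.0.0rc1.tar.gz
before=$(sha512sum pkg-1.0.0rc1.tar.gz | cut -d' ' -f1)

# Renaming to the final version leaves the digest unchanged
mv pkg-1.0.0rc1.tar.gz pkg-1.0.0.tar.gz
after=$(sha512sum pkg-1.0.0.tar.gz | cut -d' ' -f1)

[ "$before" = "$after" ] && echo "checksums match"
```

(`sha512sum` is from GNU coreutils; on macOS, `shasum -a 512` is the equivalent.)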


The status of testing the providers by the community is kept here:


https://github.com/apache/airflow/issues/32568


You can find the packages as well as detailed changelogs at the links
below:


https://pypi.org/project/apache-airflow-providers-amazon/8.3.1rc1/
https://pypi.org/project/apache-airflow-providers-cncf-kubernetes/7.3.0rc1/
https://pypi.org/project/apache-airflow-providers-google/10.4.0rc1/
https://pypi.org/project/apache-airflow-providers-http/4.5.0rc1/
https://pypi.org/project/apache-airflow-providers-microsoft-azure/6.2.1rc1/
https://pypi.org/project/apache-airflow-providers-sftp/4.4.0rc1/
https://pypi.org/project/apache-airflow-providers-snowflake/4.3.1rc1/


Cheers,

Elad Kalif


Re: [DISCUSS] Moving Dask Executor to a separate (optional?) dask provider

2023-07-12 Thread Oliveira, Niko
I think in a perfect world we'd only have the completely vendor-neutral 
executors pre-installed (Local, Sequential, Debug), and anything else would need 
to be explicitly installed by admins/users. If we were starting from scratch, 
that would make the most sense, but the Kubernetes and Celery executors are so 
ubiquitous that not installing them would cause too much wreckage. I'd like to 
push for Dask, however, to _not_ be installed by default. If that causes too 
much wreckage then perhaps we should deprecate it instead (though I'm not sure 
exactly what deprecation would look like in this context). The trouble is that 
it's difficult to measure how many folks are using the Dask executor. Perhaps 
we have data from the yearly questionnaire/survey we send?


From: Jarek Potiuk 
Sent: Wednesday, July 12, 2023 8:05:54 AM
To: dev@airflow.apache.org
Subject: [EXTERNAL] [DISCUSS] Moving Dask Executor to a separate (optional?) 
dask provider




Hello Everyone,

A small follow up after K8S/Celery executors being moved:
https://lists.apache.org/thread/7gyw7ty9vm0pokjxq7y3b1zw6mrlxfm8

We are in the process of moving the Celery / Kubernetes executors (Celery is
almost complete and I am working on K8S next, plus moving some common
discovery and config code).

But there is one more "questionable" executor - the Dask executor - still
living in Airflow Core.

When it comes to Celery/Kubernetes, we decided to make the two providers
preinstalled because it makes the most sense - we are also going to keep the
basic documentation in the "core" Airflow documentation so that it is more
easily discoverable and prominently visible - also because of
vendor-neutrality.

However, when it comes to Dask, I am not sure about its status and whether we
should make it preinstalled.

I guess there is no doubt that we should move it to a provider - that has the
same benefits as the Celery/K8S move. But whether it should be preinstalled
with Airflow, I am not sure. I do not know how frequently the Dask executor
(and Dask) is used by people using Airflow, but I personally do not think
it should be as "closely" connected with Airflow as the Celery/Kubernetes ones.

If we do not make it preinstalled, that is a somewhat (but not too much,
really) breaking change. We might still choose to install the dask provider in
the PROD reference image, so it will continue to work if you use the image;
when you are installing Airflow in a venv, you will only have to specify
`pip install apache-airflow[dask]` or manually install
`apache-airflow-providers-daskexecutor` (for now at least, this is the name
I could reserve in PyPI). So this is not really breaking, it just requires
another dependency to be installed. But some pipelines for installing
Airflow might break because it won't be pre-installed - so this is
borderline breaking.

WDYT? Should we make the dask executor pre-installed or not?

J.