Anyone waiting for the Docker images is going to have to wait until tomorrow 
(or perhaps even Monday), as the build isn’t currently behaving itself after 
the split of airflow-core and the new apache-airflow meta package:

  #95 5.136 The conflict is caused by:
  #95 5.136     The user requested apache-airflow-core==3.0.0rc1.post1
  #95 5.136     apache-airflow 3.0.0rc1.post1 depends on apache-airflow-core==3.0.0.rc1

It’s a quirk of the RC naming; we’ll fix it and get the Docker images built.
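
For anyone curious, it’s PEP 440 normalisation at work: "3.0.0.rc1" and
"3.0.0rc1" are the same version, but "3.0.0rc1.post1" is a distinct
post-release, so the == pin inside the meta package can’t be satisfied. A
quick sketch using the packaging library (illustrative only, not part of the
build itself):

```
# Illustrative only: PEP 440 treats "3.0.0.rc1" and "3.0.0rc1" as the same
# version, while ".post1" marks a distinct, later release; hence the
# resolver conflict above.
from packaging.version import Version

print(Version("3.0.0.rc1") == Version("3.0.0rc1"))        # True  (normalised)
print(Version("3.0.0rc1.post1") == Version("3.0.0.rc1"))  # False
print(Version("3.0.0rc1.post1") > Version("3.0.0.rc1"))   # True  (post-release sorts later)
```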

-ash

> On 3 Apr 2025, at 22:12, Vikram Koka <vik...@astronomer.io.INVALID> wrote:
> 
> Awesome!
> Thank you Kaxil for all your work and also thank you to all the
> contributors whose hard work and dedication made this release a reality.
> 
> Best regards,
> Vikram
> 
> 
> On Thu, Apr 3, 2025 at 2:08 PM Kaxil Naik <kaxiln...@gmail.com> wrote:
> 
>> Docker images will be out soon too.
>> 
>> On Fri, 4 Apr 2025 at 02:35, Kaxil Naik <kaxiln...@gmail.com> wrote:
>> 
>>> Hey fellow Airflowers,
>>> 
>>> I am thrilled to announce the availability of Apache Airflow 3.0.0rc1 &
>>> *Task SDK 1.0.0rc1* for testing! Airflow 3.0 marks a significant milestone
>>> as the first major release in over four years, introducing improvements
>>> that enhance user experience, task execution, and system scalability.
>>> 
>>> This email is calling for a vote on the release, which will last at
>>> least 7 days (until 10th April) and until 3 binding +1 votes have been
>>> received.
>>> 
>>> Consider this my (non-binding) +1.
>>> 
>>> Airflow 3.0.0rc1 is available at:
>>> https://dist.apache.org/repos/dist/dev/airflow/3.0.0rc1/
>>> 
>>> 
>>> "apache-airflow" Meta package:
>>> 
>>> 
>>>   - *apache-airflow-3.0.0-source.tar.gz* is a source release that comes
>>>   with INSTALL instructions.
>>>   - *apache-airflow-3.0.0.tar.gz* is the binary Python "sdist" release.
>>>   - *apache_airflow-3.0.0-py3-none-any.whl* is the binary Python
>>>   wheel "binary" release.
>>> 
>>> "apache-airflow-core" package
>>> 
>>> 
>>>   - *apache_airflow_core-3.0.0.tar.gz* is the binary Python "sdist"
>>>   release.
>>>   - *apache_airflow_core-3.0.0-py3-none-any.whl* is the binary Python
>>>   wheel "binary" release.
>>> 
>>> 
>>> Task SDK 1.0.0rc1 is available at:
>>> https://dist.apache.org/repos/dist/dev/airflow/task-sdk/1.0.0rc1/
>>> 
>>> 
>>> "apache-airflow-task-sdk" package
>>> 
>>>   - *apache-airflow-task-sdk-1.0.0-source.tar.gz* is a source release
>>>   - *apache_airflow_task_sdk-1.0.0.tar.gz* is the binary Python "sdist"
>>>   release.
>>>   - *apache_airflow_task_sdk-1.0.0-py3-none-any.whl* is the binary
>>>   Python wheel "binary" release.
>>> 
>>> 
>>> 
>>> Public keys are available at:
>>> https://dist.apache.org/repos/dist/release/airflow/KEYS
>>> 
>>> Please vote accordingly:
>>> 
>>> [ ] +1 approve
>>> [ ] +0 no opinion
>>> [ ] -1 disapprove with the reason
>>> 
>>> Only votes from PMC members are binding, but all members of the community
>>> are encouraged to test the release and vote with "(non-binding)".
>>> 
>>> The test procedure for PMC members is described in:
>>> https://github.com/apache/airflow/blob/main/dev/README_RELEASE_AIRFLOW.md#verify-the-release-candidate-by-pmc-members
>>> 
>>> The test procedure for contributors and members of the community who
>>> would like to test this RC is described in:
>>> https://github.com/apache/airflow/blob/main/dev/README_RELEASE_AIRFLOW.md#verify-the-release-candidate-by-contributors
>>> 
>>> Please note that the version number excludes the 'rcX' string, so it's
>>> now simply 3.0.0 for the Airflow package and 1.0.0 for the Task SDK. This
>>> will allow us to rename the artifact without modifying the artifact
>>> checksums when we actually release.
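>>> 
>>> (As an aside, the checksum depends only on the file contents, so renaming
>>> the artifact at release time indeed leaves the published digests intact.
>>> An illustrative sketch, not the official verification procedure, using the
>>> wheel filename above:)
>>> 
>>> ```
>>> # Illustrative only: a file's SHA-512 is a function of its contents, so
>>> # renaming the artifact does not change it.
>>> import hashlib
>>> import shutil
>>> 
>>> def sha512_of(path):
>>>     h = hashlib.sha512()
>>>     with open(path, "rb") as f:
>>>         for chunk in iter(lambda: f.read(1 << 20), b""):
>>>             h.update(chunk)
>>>     return h.hexdigest()
>>> 
>>> digest = sha512_of("apache_airflow-3.0.0-py3-none-any.whl")
>>> shutil.copy("apache_airflow-3.0.0-py3-none-any.whl", "renamed-copy.whl")
>>> assert sha512_of("renamed-copy.whl") == digest  # same digest after renaming
>>> ```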
>>> 
>>> Release Notes:
>>> https://github.com/apache/airflow/blob/3.0.0rc1/RELEASE_NOTES.rst
>>> 
>>> 
>>> *Testing Instructions using PyPI*:
>>> 
>>> You can build a virtualenv that installs this and other required
>>> packages (e.g. the Task SDK) like this:
>>> 
>>> ```
>>> uv venv
>>> uv pip install apache-airflow apache-airflow-providers-standard==0.3.0rc1 --pre
>>> ```
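>>> 
>>> As a quick, unofficial smoke test of that environment you can drop a
>>> minimal TaskFlow DAG into your dags folder; the airflow.sdk imports below
>>> are an assumption based on the new Task SDK and may differ from the final
>>> documentation:
>>> 
>>> ```
>>> # Unofficial smoke-test sketch; assumes the Task SDK exposes dag/task
>>> # under airflow.sdk.
>>> from airflow.sdk import dag, task
>>> 
>>> 
>>> @dag(schedule=None)
>>> def rc_smoke_test():
>>>     @task
>>>     def hello():
>>>         print("Hello from Airflow 3.0.0rc1")
>>> 
>>>     hello()
>>> 
>>> 
>>> rc_smoke_test()
>>> ```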
>>> 
>>> Get Involved
>>> 
>>> We encourage the community to test this release and report any issues or
>>> feedback. Your contributions help us ensure a stable and reliable Airflow
>>> 3.0.0 release. Please report issues on GitHub at
>>> https://github.com/apache/airflow/issues and mark that the issue affects
>>> 3.0.0. An updated list of all known issues in this RC can also be found
>>> at the above link under the label “affected_version:3.0.0rc”.
>>> 
>>> A huge thank you to all the contributors who have worked on this
>>> milestone release!
>>> Best,
>>> Kaxil
>>> 
>>> ---
>>> What's new in 3.0.0?
>>> 
>>> Notable Features
>>> 
>>> DAG versioning & Bundles
>>> 
>>> Airflow now tracks DAG versions, offering better visibility into
>>> historical DAG changes and execution states. The introduction of DAG
>>> Bundles ensures tasks run with the correct code version, even as DAGs
>>> evolve.
>>> 
>>> Modern Web Application
>>> 
>>> The UI has been rebuilt using React and a complete API-driven structure,
>>> improving maintainability and extensibility. It includes a new
>>> component-based design system and an enhanced information architecture. A
>>> new React-based plugin system supports custom widgets, improved workflow
>>> visibility, and integration with external tools.
>>> 
>>> Task Execution Interface
>>> 
>>> Airflow 3.0 adopts a client / server architecture, decoupling task
>>> execution from the internal meta-database via API-based interaction. This
>>> allows for remote execution across networks, multi-language support,
>>> enhanced security, and better dependency management. The Edge Executor
>>> further enables seamless remote task execution without direct database
>>> connections.
>>> 
>>> Data Assets & Asset-Centric Syntax
>>> 
>>> Airflow 3.0 enhances dataset management by introducing Data Assets,
>>> expanding beyond tables and files to include ML models and more. Assets
>>> can be explicitly defined using the @asset decorator, simplifying
>>> tracking and dependencies.
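>>> 
>>> A minimal sketch of how an @asset definition can look (the airflow.sdk
>>> import path and parameters here are assumptions and may differ from the
>>> released API):
>>> 
>>> ```
>>> # Hedged sketch only: assumes @asset is importable from airflow.sdk and
>>> # accepts a schedule; the function name becomes the asset name.
>>> from airflow.sdk import asset
>>> 
>>> 
>>> @asset(schedule="@daily")
>>> def churn_features():
>>>     # Materialise the asset; anything scheduled on it runs afterwards.
>>>     ...
>>> ```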
>>> 
>>> External Event-Driven Scheduling
>>> 
>>> Airflow now supports event-driven DAG triggers from external sources like
>>> message queues and blob stores. This builds upon dataset scheduling and
>>> enhances integration with the external data ecosystem.
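>>> 
>>> A hedged sketch of the asset-based scheduling this builds on (imports,
>>> DAG id, and asset URI are illustrative assumptions; the watcher hooks for
>>> message queues and blob stores are not shown):
>>> 
>>> ```
>>> # Illustrative only: a DAG scheduled on an Asset, the mechanism that
>>> # external event-driven triggers extend.
>>> from airflow.sdk import DAG, Asset, task
>>> 
>>> raw_events = Asset("s3://example-bucket/raw/events")
>>> 
>>> with DAG(dag_id="process_raw_events", schedule=[raw_events]):
>>> 
>>>     @task
>>>     def process():
>>>         ...
>>> 
>>>     process()
>>> ```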
>>> 
>>> 
>> 

