[ANNOUNCE] Apache Airflow Helm Chart version 1.14.0 Released

2024-06-18 Thread Jed Cunningham
Dear Airflow community,

I am pleased to announce that we have released Apache Airflow Helm chart
1.14.0.

The source release, as well as the "binary" Helm Chart release, is
available:

   Official Sources:
https://airflow.apache.org/docs/helm-chart/1.14.0/installing-helm-chart-from-sources.html
   ArtifactHub:
https://artifacthub.io/packages/helm/apache-airflow/airflow
   Docs: https://airflow.apache.org/docs/helm-chart/1.14.0/
   Quick Start Installation Guide:
https://airflow.apache.org/docs/helm-chart/1.14.0/quick-start.html
   Release Notes:
https://airflow.apache.org/docs/helm-chart/1.14.0/release_notes.html

Thanks to all the contributors who made this possible.

Thanks,
Jed


[RESULT][VOTE] Release Apache Airflow Helm Chart 1.14.0 based on 1.14.0rc1

2024-06-18 Thread Jed Cunningham
Hello all,

The vote to release Apache Airflow Helm Chart version 1.14.0 based on
1.14.0rc1 is now closed.

The vote PASSED with 3 binding "+1", 2 non-binding "+1" and 0 "-1" votes:

"+1" Binding votes:
  - Jed Cunningham
  - Jarek Potiuk
  - Ephraim Anierobi

"+1" Non-Binding votes:
  - Amogh Desai
  - Rahul Vats

Vote thread:
https://lists.apache.org/thread/zxdqtdtkhwxxy2bw4257s5nfrjf02f2f

I'll continue with the release process and the release announcement will
follow shortly.

Thanks,
Jed


Re: Call with Nielsen team demoing their DAG debugging feature

2024-06-18 Thread Kaxil Naik
Hi Meghya,

There is also `airflow tasks test`, which can run a specific task for
debugging.

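A minimal sketch of that command (the dag id, task id, and date are hypothetical; the command is echoed rather than executed here, since running it requires an Airflow installation). `airflow tasks test` runs a single task instance in-process without recording state in the metadata DB:

```shell
# Hypothetical identifiers for illustration only.
DAG_ID="example_dag"
TASK_ID="extract"
# Run one task instance for the given logical date, without touching DB state:
echo "airflow tasks test $DAG_ID $TASK_ID 2024-06-18"
```
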

Re: Call with Nielsen team demoing their DAG debugging feature

2024-06-18 Thread Meygha Machado
Hi Jarek,

Thank you for summarizing the discussion points and linking to the two PRs
that enhance the dag test feature.

We (at Nielsen) are working on a PR to extend `airflow dags test` to be
able to reconstruct the context from an existing dag run.

I think allowing the option to execute only a specific task will also be
helpful. The linked PR that marks tasks as success based on a pattern is
handy for shortening the iteration time, but it expresses the same idea as
"exclude all these other tasks". When someone wants to debug only a
specific task in a dag with many tasks, or when the task of interest is a
leaf towards the end of a dag with many branches, it takes a little extra
work to figure out which tasks should be set to success. Specifying the
task_id of interest, rather than everything to exclude, would make the CLI
more intuitive.
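To illustrate the contrast, a quick sketch (the flag names are assumptions for illustration, not merged CLI options, and the dag/task names are hypothetical):

```shell
# Exclusion style (per the linked pattern-based PR): mark everything
# except the task of interest as success.
EXCLUDE_STYLE="airflow dags test my_dag --mark-success-pattern 'task_(a|b|c)'"
# Proposed inclusion style: name only the one task you want to debug.
INCLUDE_STYLE="airflow dags test my_dag --task-id leaf_task"   # hypothetical flag
echo "$EXCLUDE_STYLE"
echo "$INCLUDE_STYLE"
```
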

Together, these features can be powerful when there is a production dag
failure whose cause is not obvious from the logs. The workflow would be:
* a production dag run failed
* the developer runs something like `airflow dags test <dag_id> <run_id>
--use-executor <executor>`, providing the run_id of the failed dag run.
They then get a debugger entrypoint into the task of interest, in the
executor of interest, with the same context as the failed dag run (who
among us hasn't spent many hours trying to replicate a production failure
scenario ;-) )
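The proposed invocation could look like this (a sketch only - the run_id-based form is still work in progress, and all identifiers below are hypothetical; the command is echoed since it is not yet available):

```shell
# Hypothetical ids for a failed production run we want to replay locally.
DAG_ID="orders_pipeline"
RUN_ID="scheduled__2024-06-17T00:00:00+00:00"
EXECUTOR="LocalExecutor"
# Reconstruct the failed run's context and debug with the chosen executor:
echo "airflow dags test $DAG_ID $RUN_ID --use-executor $EXECUTOR"
```
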


On Mon, Jun 17, 2024 at 1:14 AM Amogh Desai wrote:

> I agree with you Jarek.
>
> Every developer has their own way of debugging things, and sharing those
> with the community, including best practices, is valuable. There is always
> scope for improving the documentation!
>
> Thanks & Regards,
> Amogh Desai
>
>
> On Fri, Jun 14, 2024 at 12:17 PM Jarek Potiuk wrote:
>
>> Also, I would like to draw the attention of those who were interested in
>> the subject to two related PRs:
>>
>> * https://github.com/apache/airflow/pull/40010 by jannisko (sorry if you
>> see this - I do not know your real name :) ) -> where you can run a dag
>> test while skipping (or actually marking as success) some tasks (for
>> example sensors)
>> * https://github.com/apache/airflow/pull/40205 by Vincent -> where you
>> can run a dag test using an executor rather than `_run_raw_task`
>>
>> I think the debug feature that Nielsen showcased on the call falls into
>> the same pattern of making our `airflow dags test` more powerful - and it
>> would be great if we could incorporate a similar pattern - where you can
>> recreate the task context from an already executed dag_run - as part of
>> the `airflow dags test` CLI command and the `dag.test()` method.
>>
>> I think this is also a good opportunity to enhance the documentation and
>> explain all those patterns for how you can debug a dag - in
>>
>> https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/debug.html#testing-dags-with-dag-test
>> - for now the documentation is rather bare-bones, but it would be great
>> if we explained some of the best practices and use cases for how dag
>> debugging might be done using the `airflow dags test` command.
>>
>> I think maybe other people have their own patterns of testing DAGs that
>> they could contribute here - both as documentation updates and maybe as
>> new features of our existing `airflow dags test` command.
>>
>> WDYT?
>>
>> J.
>>
>>
>> On Thu, Jun 13, 2024 at 10:29 PM Stefan Krawczyk wrote:
>>
>> > +1 for the recording please.
>> >
>> > On Thu, Jun 13, 2024, 1:26 PM Jarek Potiuk  wrote:
>> >
>> > > Just summarizing the call:
>> > >
>> > > * we had a demo from Nielsen showing their debugging feature
>> > > * it's based on environments created in their research environment,
>> > > where users can run DAGs and debug individual tasks - basically
>> > > replaying and debugging tasks based on existing DAG runs, but without
>> > > saving any changes to the dag's state in the DB
>> > > * a pretty useful thing is the way they use an existing DAG run to
>> > > recreate the context of execution
>> > > * the Nielsen team used it with Airflow 1.10; they will look into how
>> > > the new `dag.test()` feature from Airflow 2.5 can be plugged into it
>> > > and come back to us
>> > > * a nice thing is that they hooked it up with a VSCode plugin, so
>> > > they can easily do all of that within VSCode and debug
>> > > out-of-the-box
>> > > * possibly they could generalise it, either as a "what could be done
>> > > by others" description or maybe even as VSCode-from-airflow
>> > > out-of-the-box (the latter was my brainstorming idea).
>> > >
>> > > I have a recording - I do not want to publish it on the public
>> > > devlist, but if anyone is interested - let me know and I will share.
>> > >
>> > > J.
>> > >
>> > >
>> > > On Thu, Jun 13, 2024 at 6:49 AM Albert Okiri wrote:
>> > >
>> > > > Hi Jarek, I'm interested in joining this call.
>> > > >
>> > > > Regards,
>> > > > Albert.
>> > > >
>> > > > On Thu, 13 Jun 2024, 07:43 

[Lazy Consensus] Remove "Experimental" banner for OpenTelemetry Metrics

2024-06-18 Thread Ferruzzi, Dennis
Hi all. We added OTel Metrics a bit over a year ago and there have been no
major bugs reported. I've had a few users ask when we'll remove the
"Experimental" banner, so I propose we do that.
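For anyone who wants to try them before the banner comes down, OTel metrics are enabled through the `[metrics]` section of the Airflow config - a minimal sketch (the host and port values below are assumptions for a local OTLP collector, not recommendations):

```ini
[metrics]
otel_on = True
otel_host = localhost
otel_port = 4318
otel_prefix = airflow
```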

If there are no objections, I'll take care of it next Tuesday.


 - ferruzzi