plutaniano opened a new issue, #56128:
URL: https://github.com/apache/airflow/issues/56128
### Apache Airflow version
3.1.0
### If "Other Airflow 2 version" selected, which one?
_No response_
### What happened?
Some of my DAG definitions, which serialize without errors in 2.11.0,
throw errors when serialized in 3.0.0. It seems that some argument
names and orderings are not handled correctly by `@task`.
More details in the reproduction steps.
### What you think should happen instead?
Both example DAGs in the reproduction steps should serialize without
errors.
### How to reproduce
Create the three following files:
```python
# example_1.py
from airflow.decorators import dag, task
@dag()
def example_1():
@task
def foo(end_date, start_date): ...
foo(None, None)
example_1()
```
```python
# example_2.py
from airflow.decorators import dag, task
@dag()
def example_2():
@task
def foo(start_date, end_date): ...
foo(None, None)
example_2()
```
```shell
# test.sh
docker run \
--rm \
--entrypoint "/bin/bash" \
--volume .:/opt/airflow/dags/ \
"apache/airflow:${1}" \
-c \
"
airflow db migrate > /dev/null 2>&1;
airflow dags report 2>/dev/null;
"
```
Running `./test.sh 2.11.0` serializes both DAGs successfully, but running
`./test.sh 3.0.0` or `./test.sh 3.1.0` serializes only one of them:
```
lucas@tiny ~/Desktop/poc % ./test.sh 2.11.0
file | duration | dag_num | task_num | dags
==============+================+=========+==========+==========
/example_1.py | 0:00:00.031625 | 1 | 1 | example_1
/example_2.py | 0:00:00.000719 | 1 | 1 | example_2
lucas@tiny ~/Desktop/poc % ./test.sh 3.0.0
file | duration | dag_num | task_num | dags
==============+================+=========+==========+==========
/example_1.py | 0:00:00.055614 | 1 | 1 | example_1
/example_2.py | 0:00:00.001864 | 0 | 0 |
lucas@tiny ~/Desktop/poc % ./test.sh 3.1.0
file | duration | dag_num | task_num | dags
==============+================+=========+==========+==========
/example_1.py | 0:00:00.015002 | 1 | 1 | example_1
/example_2.py | 0:00:00.007117 | 0 | 0 |
```
This is the traceback of the error I get when serializing the DAG in >=3.0.0:
```
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/decorator.py", line 214, in __init__
    signature = signature.replace(parameters=parameters)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/python/lib/python3.12/inspect.py", line 3109, in replace
    return type(self)(parameters,
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/python/lib/python3.12/inspect.py", line 3065, in __init__
    raise ValueError(msg)
ValueError: non-default argument follows default argument

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/models/dagbag.py", line 405, in parse
    loader.exec_module(new_module)
  File "<frozen importlib._bootstrap_external>", line 999, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "/opt/airflow/dags/example_2.py", line 12, in <module>
    example_2()
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/definitions/dag.py", line 1514, in factory
    f(**f_kwargs)
  File "/opt/airflow/dags/example_2.py", line 9, in example_2
    foo(None, None)
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/decorator.py", line 363, in __call__
    op = self.operator_class(
         ^^^^^^^^^^^^^^^^^^^^
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/operator.py", line 521, in apply_defaults
    result = func(self, **kwargs, default_args=default_args)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/standard/decorators/python.py", line 58, in __init__
    super().__init__(
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/operator.py", line 521, in apply_defaults
    result = func(self, **kwargs, default_args=default_args)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/decorator.py", line 229, in __init__
    raise ValueError(message) from err
ValueError:
The function signature broke while assigning defaults to context key parameters.
The decorator is replacing the signature
> foo(start_date, end_date)
with
> foo(start_date=None, end_date)
which isn't valid: non-default argument follows default argument
```
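Judging from the final error message, the decorator appears to attach a default only to the `start_date` context-key parameter, which explains why `example_1` (where `end_date` precedes `start_date`) parses fine while `example_2` fails. The underlying `ValueError` comes straight from `inspect.Signature`, which rejects any parameter list in which a non-default parameter follows one with a default. A minimal sketch outside Airflow (my reading of what `decorator.py` does around line 214, not the actual implementation):

```python
import inspect

def foo(start_date, end_date): ...

sig = inspect.signature(foo)
# Mimic the decorator assigning a default only to the `start_date`
# context key, leaving `end_date` untouched.
parameters = [
    p.replace(default=None) if name == "start_date" else p
    for name, p in sig.parameters.items()
]

try:
    sig.replace(parameters=parameters)
except ValueError as err:
    print(err)  # non-default argument follows default argument
```

With the parameters in the opposite order, as in `example_1`, the rebuilt signature is `foo(end_date, start_date=None)`, which is valid, so that DAG serializes.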
### Operating System
Debian 12
### Versions of Apache Airflow Providers
None
### Deployment
Other
### Deployment details
Not relevant.
### Anything else?
_No response_
### Are you willing to submit PR?
- [x] Yes I am willing to submit a PR!
### Code of Conduct
- [x] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)