Hello there - we use dev@ for important announcements and discussions.
For raising questions or issues, please use GitHub Discussions or Issues
(follow the templates for issues, but if you have a question and are not
sure whether it is an issue, start with Discussions), or ask in
#user-troubleshooting on Slack. You will find all the pointers on the
"community" page of our main site, https://airflow.apache.org.

Also, if you want to take part in future discussions, please subscribe
to the dev list; otherwise we have to moderate your messages (the
community page also tells you how to do that).
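
That said, one quick hint from your logs: the worker gets a 503 from
http://airflow-apiserver:8080/execution/... In Airflow 3, workers talk to
the api-server's Execution API over HTTP, so that endpoint must be up and
all components must share the same JWT secret. This is a sketch of the
usual suspects in a docker-compose setup, not a diagnosis - the service
name is taken from your logs, the secret value is a placeholder you must
set yourself, and it assumes airflow-apiserver defines a healthcheck as
the reference docker-compose file does:

    x-airflow-common-env: &airflow-common-env
      # All Airflow components must agree on this secret, otherwise
      # Execution API calls can fail with auth errors
      AIRFLOW__API_AUTH__JWT_SECRET: "change-me-to-a-long-random-string"
      # Where workers reach the api-server's Execution API
      AIRFLOW__CORE__EXECUTION_API_SERVER_URL: "http://airflow-apiserver:8080/execution/"

    services:
      airflow-worker:
        environment: *airflow-common-env
        depends_on:
          airflow-apiserver:
            condition: service_healthy  # don't run tasks before the API is healthy

If that does not fix it, please bring your compose file and the logs to
GitHub Discussions as described above.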

On Fri, Nov 21, 2025 at 9:36 AM giant contend 0 <[email protected]> wrote:
>
> hi!
> I'm updating my Airflow to 3.1.3 to use the new version of your great app.
>
> Can you help me with a problem? When I start my simple DAG, I get an
> error related to auth.
> What do I need to change in my docker-compose.yaml?
>
> Or maybe it is a bug?
>
> airflow-apiserver-1      | INFO:     172.18.0.1:33156 - "GET 
> /api/v2/dags/mssql_operator_minimal2/dagRuns/manual__2025-11-21T07%3A07%3A37%2B00%3A00/hitlDetails?task_id=check_mssql
>  HTTP/1.1" 200 OK
> airflow-apiserver-1      | INFO:     172.18.0.1:33202 - "GET 
> /api/v2/dags/mssql_operator_minimal2/dagRuns/manual__2025-11-21T07%3A07%3A37%2B00%3A00/taskInstances/check_mssql/logs/1?map_index=-1
>  HTTP/1.1" 200 OK
> airflow-apiserver-1      | INFO:     172.18.0.1:33204 - "GET 
> /ui/structure/structure_data?dag_id=mssql_operator_minimal2 HTTP/1.1" 200 OK
> airflow-worker-1         | 2025-11-21T07:09:04.307702Z [warning  ] Starting 
> call to 'airflow.sdk.api.client.Client.request', this is the 4th time calling 
> it. [airflow.sdk.api.client] loc=before.py:42
> airflow-apiserver-1      | INFO:     172.18.0.1:33214 - "GET 
> /ui/grid/structure/mssql_operator_minimal2?limit=10&order_by=-run_after 
> HTTP/1.1" 200 OK
> airflow-apiserver-1      | INFO:     172.18.0.1:33228 - "GET 
> /ui/grid/ti_summaries/mssql_operator_minimal2/manual__2025-11-21T07%3A07%3A37%2B00%3A00
>  HTTP/1.1" 200 OK
> airflow-apiserver-1      | INFO:     172.18.0.1:33224 - "GET 
> /ui/grid/runs/mssql_operator_minimal2?limit=10&order_by=-run_after HTTP/1.1" 
> 200 OK
> airflow-apiserver-1      | INFO:     172.18.0.1:33230 - "GET 
> /api/v2/dags/mssql_operator_minimal2/dagRuns/manual__2025-11-21T07%3A07%3A37%2B00%3A00/taskInstances/check_mssql/-1
>  HTTP/1.1" 200 OK
> airflow-apiserver-1      | INFO:     172.18.0.1:33246 - "GET 
> /api/v2/dags/mssql_operator_minimal2/dagRuns/manual__2025-11-21T07%3A07%3A37%2B00%3A00
>  HTTP/1.1" 200 OK
> airflow-apiserver-1      | INFO:     172.18.0.1:33254 - "GET 
> /api/v2/dags/mssql_operator_minimal2/dagRuns/manual__2025-11-21T07%3A07%3A37%2B00%3A00/hitlDetails?task_id=check_mssql
>  HTTP/1.1" 200 OK
> airflow-apiserver-1      | INFO:     172.18.0.1:33260 - "GET 
> /api/v2/dags/mssql_operator_minimal2/dagRuns/manual__2025-11-21T07%3A07%3A37%2B00%3A00/taskInstances/check_mssql/logs/1?map_index=-1
>  HTTP/1.1" 200 OK
> airflow-worker-1         | 2025-11-21T07:09:06.083702Z [info     ] Process 
> exited                 [supervisor] exit_code=-9 loc=supervisor.py:709 pid=88 
> signal_sent=SIGKILL
> airflow-worker-1         | 2025-11-21T07:09:06.095104Z [error    ] Task 
> execute_workload[2548223d-ff74-4bea-96a2-a55e66d8510b] raised unexpected: 
> HTTPError("Server error '503 Service Unavailable' for url 
> 'http://airflow-apiserver:8080/execution/task-instances/019aa53e-a97f-76d7-ba38-d7fdd122df15/run'\nFor
>  more information check: 
> https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/503")
> [celery.app.trace] loc=trace.py:267
> airflow-worker-1         | Traceback (most recent call last):
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/celery/app/trace.py", line 
> 453, in trace_task
> airflow-worker-1         |     R = retval = fun(*args, **kwargs)
> airflow-worker-1         |                  ^^^^^^^^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/celery/app/trace.py", line 
> 736, in __protected_call__
> airflow-worker-1         |     return self.run(*args, **kwargs)
> airflow-worker-1         |            ^^^^^^^^^^^^^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/celery/executors/celery_executor_utils.py",
>  line 163, in execute_workload
> airflow-worker-1         |     supervise(
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/execution_time/supervisor.py",
>  line 1940, in supervise
> airflow-worker-1         |     process = ActivitySubprocess.start(
> airflow-worker-1         |               ^^^^^^^^^^^^^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/execution_time/supervisor.py",
>  line 954, in start
> airflow-worker-1         |     proc._on_child_started(ti=what, 
> dag_rel_path=dag_rel_path, bundle_info=bundle_info)
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/execution_time/supervisor.py",
>  line 965, in _on_child_started
> airflow-worker-1         |     ti_context = 
> self.client.task_instances.start(ti.id, self.pid, start_date)
> airflow-worker-1         |                  
> ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/api/client.py",
>  line 215, in start
> airflow-worker-1         |     resp = 
> self.client.patch(f"task-instances/{id}/run", content=body.model_dump_json())
> airflow-worker-1         |            
> ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/httpx/_client.py", line 
> 1218, in patch
> airflow-worker-1         |     return self.request(
> airflow-worker-1         |            ^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/tenacity/__init__.py", 
> line 338, in wrapped_f
> airflow-worker-1         |     return copy(f, *args, **kw)
> airflow-worker-1         |            ^^^^^^^^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/tenacity/__init__.py", 
> line 477, in __call__
> airflow-worker-1         |     do = self.iter(retry_state=retry_state)
> airflow-worker-1         |          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/tenacity/__init__.py", 
> line 378, in iter
> airflow-worker-1         |     result = action(retry_state)
> airflow-worker-1         |              ^^^^^^^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/tenacity/__init__.py", 
> line 420, in exc_check
> airflow-worker-1         |     raise retry_exc.reraise()
> airflow-worker-1         |           ^^^^^^^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/tenacity/__init__.py", 
> line 187, in reraise
> airflow-worker-1         |     raise self.last_attempt.result()
> airflow-worker-1         |           ^^^^^^^^^^^^^^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/usr/python/lib/python3.12/concurrent/futures/_base.py", line 449, in result
> airflow-worker-1         |     return self.__get_result()
> airflow-worker-1         |            ^^^^^^^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/usr/python/lib/python3.12/concurrent/futures/_base.py", line 401, in 
> __get_result
> airflow-worker-1         |     raise self._exception
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/tenacity/__init__.py", 
> line 480, in __call__
> airflow-worker-1         |     result = fn(*args, **kwargs)
> airflow-worker-1         |              ^^^^^^^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/api/client.py",
>  line 885, in request
> airflow-worker-1         |     return super().request(*args, **kwargs)
> airflow-worker-1         |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/httpx/_client.py", line 
> 825, in request
> airflow-worker-1         |     return self.send(request, auth=auth, 
> follow_redirects=follow_redirects)
> airflow-worker-1         |            
> ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/httpx/_client.py", line 
> 914, in send
> airflow-worker-1         |     response = self._send_handling_auth(
> airflow-worker-1         |                ^^^^^^^^^^^^^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/httpx/_client.py", line 
> 942, in _send_handling_auth
> airflow-worker-1         |     response = self._send_handling_redirects(
> airflow-worker-1         |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/httpx/_client.py", line 
> 999, in _send_handling_redirects
> airflow-worker-1         |     raise exc
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/httpx/_client.py", line 
> 982, in _send_handling_redirects
> airflow-worker-1         |     hook(response)
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/api/client.py",
>  line 186, in raise_on_4xx_5xx_with_note
> airflow-worker-1         |     return get_json_error(response) or 
> response.raise_for_status()
> airflow-worker-1         |                                        
> ^^^^^^^^^^^^^^^^^^^^^^^^^^^
> airflow-worker-1         |   File 
> "/home/airflow/.local/lib/python3.12/site-packages/httpx/_models.py", line 
> 829, in raise_for_status
> airflow-worker-1         |     raise HTTPStatusError(message, 
> request=request, response=self)
> airflow-worker-1         | httpx.HTTPStatusError: Server error '503 Service 
> Unavailable' for url 
> 'http://airflow-apiserver:8080/execution/task-instances/019aa53e-a97f-76d7-ba38-d7fdd122df15/run'
> airflow-worker-1         | For more information check: 
> https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/503
> airflow-worker-1         | Correlation-id=019aa53e-c698-793b-87ec-3f12da01ff86
> airflow-scheduler-1      | 2025-11-21T07:09:06.622899Z [info     ] Received 
> executor event with state failed for task instance 
> TaskInstanceKey(dag_id='mssql_operator_minimal2', task_id='check_mssql', 
> run_id='manual__2025-11-21T07:07:37+00:00', try_number=1, map_index=-1) 
> [airflow.jobs.scheduler_job_runner.SchedulerJobRunner] 
> loc=scheduler_job_runner.py:818
> airflow-scheduler-1      | 2025-11-21T07:09:06.630745Z [info     ] 
> TaskInstance Finished: dag_id=mssql_operator_minimal2, task_id=check_mssql, 
> run_id=manual__2025-11-21T07:07:37+00:00, map_index=-1, run_start_date=None, 
> run_end_date=None, run_duration=None, state=queued, 
> executor=CeleryExecutor(parallelism=32), executor_state=failed, try_number=1, 
> max_tries=0, pool=default_pool, queue=default, priority_weight=1, 
> operator=SQLExecuteQueryOperator, queued_dttm=2025-11-21 
> 07:08:58.929696+00:00, scheduled_dttm=2025-11-21 
> 07:08:58.916968+00:00,queued_by_job_id=2390, pid=None 
> [airflow.jobs.scheduler_job_runner.SchedulerJobRunner] 
> loc=scheduler_job_runner.py:864
> airflow-scheduler-1      | 2025-11-21T07:09:06.632601Z [error    ] Executor 
> CeleryExecutor(parallelism=32) reported that the task instance <TaskInstance: 
> mssql_operator_minimal2.check_mssql manual__2025-11-21T07:07:37+00:00 
> [queued]> finished with state failed, but the task instance's state attribute 
> is queued. Learn more: 
> https://airflow.apache.org/docs/apache-airflow/stable/troubleshooting.html#task-state-changed-externally
>  [airflow.task] loc=taskinstance.py:1507
> airflow-scheduler-1      | 2025-11-21T07:09:06.635870Z [info     ] Marking 
> task as FAILED. dag_id=mssql_operator_minimal2, task_id=check_mssql, 
> run_id=manual__2025-11-21T07:07:37+00:00, logical_date=20251121T070737, 
> start_date=, end_date=20251121T070906 [airflow.models.taskinstance] 
> loc=taskinstance.py:1597
> airflow-dag-processor-1  | 2025-11-21T07:09:07.131382Z [info     ] Not time 
> to refresh bundle dags-folder 
> [airflow.dag_processing.manager.DagFileProcessorManager] loc=manager.py:550
> airflow-scheduler-1      | 2025-11-21T07:09:07.664201Z [info     ] Marking 
> run <DagRun mssql_operator_minimal2 @ 2025-11-21 07:07:37+00:00: 
> manual__2025-11-21T07:07:37+00:00, state:running, queued_at: 2025-11-21 
> 07:08:58.611041+00:00. run_type: manual> failed 
> [airflow.models.dagrun.DagRun] loc=dagrun.py:1171
> airflow-scheduler-1      | 2025-11-21T07:09:07.664416Z [info     ] DagRun 
> Finished: dag_id=mssql_operator_minimal2, logical_date=2025-11-21 
> 07:07:37+00:00, run_id=manual__2025-11-21T07:07:37+00:00, 
> run_start_date=2025-11-21 07:08:58.900946+00:00, run_end_date=2025-11-21 
> 07:09:07.664341+00:00, run_duration=8.763395, state=failed, run_type=manual, 
> data_interval_start=2025-11-21 07:07:37+00:00, data_interval_end=2025-11-21 
> 07:07:37+00:00, [airflow.models.dagrun.DagRun] loc=dagrun.py:1274
> airflow-apiserver-1      | INFO:     172.18.0.1:33290 - "GET 
> /ui/grid/ti_summaries/mssql_operator_minimal2/manual__2025-11-21T07%3A07%3A37%2B00%3A00
>  HTTP/1.1" 200 OK

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
