emiliadecaudin commented on issue #50802:
URL: https://github.com/apache/airflow/issues/50802#issuecomment-2894782936

   > Can you please share the error that you're getting in the worker process
   
   Of course; here are the full logs of the worker process from the "minimal" 
environment I described in the ticket. The steps taken were:
   1. Start up containers.
   2. Run the following DAG twice:
   
   ```python
   import datetime
   import logging
   
   from airflow.decorators import task
   from airflow.sdk import DAG
   
   LOGGER = logging.getLogger(__name__)
   
   with DAG(
       dag_id="test_dag",
       schedule=None,
       start_date=datetime.datetime(2025, 1, 1),
   ):
   
       @task
       def test() -> None:
           LOGGER.info("Hello world!")
   
       test()
   ```
   
   3. Shut down containers.
   
   ----
   
   ```
   2025-05-20 11:07:42.773 | 
   2025-05-20 11:07:44.768 | [2025-05-20T15:07:44.767+0000] {settings.py:345} 
DEBUG - Setting up DB connection pool (PID 7)
   2025-05-20 11:07:44.768 | [2025-05-20T15:07:44.767+0000] {settings.py:454} 
DEBUG - settings.prepare_engine_args(): Using pool settings. pool_size=5, 
max_overflow=10, pool_recycle=1800, pid=7
   2025-05-20 11:07:44.783 | [2025-05-20T15:07:44.782+0000] 
{configuration.py:852} DEBUG - Could not retrieve value from section core, for 
key asset_manager_kwargs. Skipping redaction of this conf.
   2025-05-20 11:07:44.783 | [2025-05-20T15:07:44.782+0000] 
{configuration.py:852} DEBUG - Could not retrieve value from section database, 
for key sql_alchemy_engine_args. Skipping redaction of this conf.
   2025-05-20 11:07:45.099 | [2025-05-20T15:07:45.098+0000] 
{cli_action_loggers.py:50} DEBUG - Adding <function default_action_log at 
0xffffb1ec85e0> to pre execution callback
   2025-05-20 11:07:45.221 | [2025-05-20T15:07:45.221+0000] {serde.py:359} 
DEBUG - registering decimal.Decimal for serialization
   2025-05-20 11:07:45.221 | [2025-05-20T15:07:45.221+0000] {serde.py:366} 
DEBUG - registering decimal.Decimal for deserialization
   2025-05-20 11:07:45.221 | [2025-05-20T15:07:45.221+0000] {serde.py:359} 
DEBUG - registering builtins.frozenset for serialization
   2025-05-20 11:07:45.222 | [2025-05-20T15:07:45.221+0000] {serde.py:359} 
DEBUG - registering builtins.set for serialization
   2025-05-20 11:07:45.222 | [2025-05-20T15:07:45.222+0000] {serde.py:359} 
DEBUG - registering builtins.tuple for serialization
   2025-05-20 11:07:45.222 | [2025-05-20T15:07:45.222+0000] {serde.py:366} 
DEBUG - registering builtins.frozenset for deserialization
   2025-05-20 11:07:45.222 | [2025-05-20T15:07:45.222+0000] {serde.py:366} 
DEBUG - registering builtins.set for deserialization
   2025-05-20 11:07:45.223 | [2025-05-20T15:07:45.222+0000] {serde.py:366} 
DEBUG - registering builtins.tuple for deserialization
   2025-05-20 11:07:45.223 | [2025-05-20T15:07:45.223+0000] {serde.py:374} 
DEBUG - registering builtins.frozenset for stringifying
   2025-05-20 11:07:45.223 | [2025-05-20T15:07:45.223+0000] {serde.py:374} 
DEBUG - registering builtins.set for stringifying
   2025-05-20 11:07:45.223 | [2025-05-20T15:07:45.223+0000] {serde.py:374} 
DEBUG - registering builtins.tuple for stringifying
   2025-05-20 11:07:45.223 | [2025-05-20T15:07:45.223+0000] {serde.py:359} 
DEBUG - registering datetime.date for serialization
   2025-05-20 11:07:45.224 | [2025-05-20T15:07:45.223+0000] {serde.py:359} 
DEBUG - registering datetime.datetime for serialization
   2025-05-20 11:07:45.224 | [2025-05-20T15:07:45.224+0000] {serde.py:359} 
DEBUG - registering datetime.timedelta for serialization
   2025-05-20 11:07:45.224 | [2025-05-20T15:07:45.224+0000] {serde.py:359} 
DEBUG - registering pendulum.datetime.DateTime for serialization
   2025-05-20 11:07:45.224 | [2025-05-20T15:07:45.224+0000] {serde.py:366} 
DEBUG - registering datetime.date for deserialization
   2025-05-20 11:07:45.224 | [2025-05-20T15:07:45.224+0000] {serde.py:366} 
DEBUG - registering datetime.datetime for deserialization
   2025-05-20 11:07:45.224 | [2025-05-20T15:07:45.224+0000] {serde.py:366} 
DEBUG - registering datetime.timedelta for deserialization
   2025-05-20 11:07:45.224 | [2025-05-20T15:07:45.224+0000] {serde.py:366} 
DEBUG - registering pendulum.datetime.DateTime for deserialization
   2025-05-20 11:07:45.225 | [2025-05-20T15:07:45.225+0000] {serde.py:359} 
DEBUG - registering deltalake.table.DeltaTable for serialization
   2025-05-20 11:07:45.225 | [2025-05-20T15:07:45.225+0000] {serde.py:366} 
DEBUG - registering deltalake.table.DeltaTable for deserialization
   2025-05-20 11:07:45.225 | [2025-05-20T15:07:45.225+0000] {serde.py:374} 
DEBUG - registering deltalake.table.DeltaTable for stringifying
   2025-05-20 11:07:45.225 | [2025-05-20T15:07:45.225+0000] {serde.py:359} 
DEBUG - registering pyiceberg.table.Table for serialization
   2025-05-20 11:07:45.225 | [2025-05-20T15:07:45.225+0000] {serde.py:366} 
DEBUG - registering pyiceberg.table.Table for deserialization
   2025-05-20 11:07:45.226 | [2025-05-20T15:07:45.225+0000] {serde.py:374} 
DEBUG - registering pyiceberg.table.Table for stringifying
   2025-05-20 11:07:45.226 | [2025-05-20T15:07:45.226+0000] {serde.py:359} 
DEBUG - registering 
kubernetes.client.models.v1_resource_requirements.V1ResourceRequirements for 
serialization
   2025-05-20 11:07:45.226 | [2025-05-20T15:07:45.226+0000] {serde.py:359} 
DEBUG - registering kubernetes.client.models.v1_pod.V1Pod for serialization
   2025-05-20 11:07:45.226 | [2025-05-20T15:07:45.226+0000] {serde.py:359} 
DEBUG - registering numpy.int8 for serialization
   2025-05-20 11:07:45.226 | [2025-05-20T15:07:45.226+0000] {serde.py:359} 
DEBUG - registering numpy.int16 for serialization
   2025-05-20 11:07:45.227 | [2025-05-20T15:07:45.226+0000] {serde.py:359} 
DEBUG - registering numpy.int32 for serialization
   2025-05-20 11:07:45.227 | [2025-05-20T15:07:45.226+0000] {serde.py:359} 
DEBUG - registering numpy.int64 for serialization
   2025-05-20 11:07:45.227 | [2025-05-20T15:07:45.227+0000] {serde.py:359} 
DEBUG - registering numpy.uint8 for serialization
   2025-05-20 11:07:45.227 | [2025-05-20T15:07:45.227+0000] {serde.py:359} 
DEBUG - registering numpy.uint16 for serialization
   2025-05-20 11:07:45.227 | [2025-05-20T15:07:45.227+0000] {serde.py:359} 
DEBUG - registering numpy.uint32 for serialization
   2025-05-20 11:07:45.227 | [2025-05-20T15:07:45.227+0000] {serde.py:359} 
DEBUG - registering numpy.uint64 for serialization
   2025-05-20 11:07:45.227 | [2025-05-20T15:07:45.227+0000] {serde.py:359} 
DEBUG - registering numpy.bool_ for serialization
   2025-05-20 11:07:45.227 | [2025-05-20T15:07:45.227+0000] {serde.py:359} 
DEBUG - registering numpy.float64 for serialization
   2025-05-20 11:07:45.228 | [2025-05-20T15:07:45.227+0000] {serde.py:359} 
DEBUG - registering numpy.float16 for serialization
   2025-05-20 11:07:45.228 | [2025-05-20T15:07:45.228+0000] {serde.py:359} 
DEBUG - registering numpy.complex128 for serialization
   2025-05-20 11:07:45.228 | [2025-05-20T15:07:45.228+0000] {serde.py:359} 
DEBUG - registering numpy.complex64 for serialization
   2025-05-20 11:07:45.228 | [2025-05-20T15:07:45.228+0000] {serde.py:366} 
DEBUG - registering numpy.int8 for deserialization
   2025-05-20 11:07:45.228 | [2025-05-20T15:07:45.228+0000] {serde.py:366} 
DEBUG - registering numpy.int16 for deserialization
   2025-05-20 11:07:45.228 | [2025-05-20T15:07:45.228+0000] {serde.py:366} 
DEBUG - registering numpy.int32 for deserialization
   2025-05-20 11:07:45.228 | [2025-05-20T15:07:45.228+0000] {serde.py:366} 
DEBUG - registering numpy.int64 for deserialization
   2025-05-20 11:07:45.229 | [2025-05-20T15:07:45.228+0000] {serde.py:366} 
DEBUG - registering numpy.uint8 for deserialization
   2025-05-20 11:07:45.229 | [2025-05-20T15:07:45.229+0000] {serde.py:366} 
DEBUG - registering numpy.uint16 for deserialization
   2025-05-20 11:07:45.229 | [2025-05-20T15:07:45.229+0000] {serde.py:366} 
DEBUG - registering numpy.uint32 for deserialization
   2025-05-20 11:07:45.229 | [2025-05-20T15:07:45.229+0000] {serde.py:366} 
DEBUG - registering numpy.uint64 for deserialization
   2025-05-20 11:07:45.229 | [2025-05-20T15:07:45.229+0000] {serde.py:366} 
DEBUG - registering numpy.bool_ for deserialization
   2025-05-20 11:07:45.229 | [2025-05-20T15:07:45.229+0000] {serde.py:366} 
DEBUG - registering numpy.float64 for deserialization
   2025-05-20 11:07:45.229 | [2025-05-20T15:07:45.229+0000] {serde.py:366} 
DEBUG - registering numpy.float16 for deserialization
   2025-05-20 11:07:45.229 | [2025-05-20T15:07:45.229+0000] {serde.py:366} 
DEBUG - registering numpy.complex128 for deserialization
   2025-05-20 11:07:45.230 | [2025-05-20T15:07:45.229+0000] {serde.py:366} 
DEBUG - registering numpy.complex64 for deserialization
   2025-05-20 11:07:45.230 | [2025-05-20T15:07:45.230+0000] {serde.py:359} 
DEBUG - registering pandas.core.frame.DataFrame for serialization
   2025-05-20 11:07:45.230 | [2025-05-20T15:07:45.230+0000] {serde.py:366} 
DEBUG - registering pandas.core.frame.DataFrame for deserialization
   2025-05-20 11:07:45.230 | [2025-05-20T15:07:45.230+0000] {serde.py:359} 
DEBUG - registering pendulum.tz.timezone.FixedTimezone for serialization
   2025-05-20 11:07:45.230 | [2025-05-20T15:07:45.230+0000] {serde.py:359} 
DEBUG - registering pendulum.tz.timezone.Timezone for serialization
   2025-05-20 11:07:45.230 | [2025-05-20T15:07:45.230+0000] {serde.py:359} 
DEBUG - registering zoneinfo.ZoneInfo for serialization
   2025-05-20 11:07:45.230 | [2025-05-20T15:07:45.230+0000] {serde.py:366} 
DEBUG - registering pendulum.tz.timezone.FixedTimezone for deserialization
   2025-05-20 11:07:45.231 | [2025-05-20T15:07:45.230+0000] {serde.py:366} 
DEBUG - registering pendulum.tz.timezone.Timezone for deserialization
   2025-05-20 11:07:45.231 | [2025-05-20T15:07:45.230+0000] {serde.py:366} 
DEBUG - registering zoneinfo.ZoneInfo for deserialization
   2025-05-20 11:07:45.231 | [2025-05-20T15:07:45.231+0000] {serde.py:377} 
DEBUG - loading serializers took 9.770 seconds
   2025-05-20 11:07:45.539 | [2025-05-20T15:07:45.539+0000] 
{cli_action_loggers.py:78} DEBUG - Calling callbacks: [<function 
default_action_log at 0xffffb1ec85e0>]
   2025-05-20 11:07:45.600 | [2025-05-20T15:07:45.600+0000] 
{providers_manager.py:356} DEBUG - Initializing Providers Manager[config]
   2025-05-20 11:07:45.600 | [2025-05-20T15:07:45.600+0000] 
{providers_manager.py:356} DEBUG - Initializing Providers Manager[list]
   2025-05-20 11:07:45.605 | [2025-05-20T15:07:45.605+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.odbc.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-odbc
   2025-05-20 11:07:45.606 | [2025-05-20T15:07:45.606+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.google.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-google
   2025-05-20 11:07:45.610 | [2025-05-20T15:07:45.609+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.common.sql.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package 
apache-airflow-providers-common-sql
   2025-05-20 11:07:45.610 | [2025-05-20T15:07:45.610+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.http.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-http
   2025-05-20 11:07:45.611 | [2025-05-20T15:07:45.611+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.snowflake.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-snowflake
   2025-05-20 11:07:45.611 | [2025-05-20T15:07:45.611+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.cncf.kubernetes.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package 
apache-airflow-providers-cncf-kubernetes
   2025-05-20 11:07:45.612 | [2025-05-20T15:07:45.612+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.standard.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-standard
   2025-05-20 11:07:45.613 | [2025-05-20T15:07:45.613+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.mysql.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-mysql
   2025-05-20 11:07:45.613 | [2025-05-20T15:07:45.613+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.sendgrid.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-sendgrid
   2025-05-20 11:07:45.614 | [2025-05-20T15:07:45.614+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.hashicorp.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-hashicorp
   2025-05-20 11:07:45.614 | [2025-05-20T15:07:45.614+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.smtp.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-smtp
   2025-05-20 11:07:45.615 | [2025-05-20T15:07:45.615+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.fab.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-fab
   2025-05-20 11:07:45.615 | [2025-05-20T15:07:45.615+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.ftp.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-ftp
   2025-05-20 11:07:45.616 | [2025-05-20T15:07:45.616+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.redis.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-redis
   2025-05-20 11:07:45.616 | [2025-05-20T15:07:45.616+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.postgres.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-postgres
   2025-05-20 11:07:45.617 | [2025-05-20T15:07:45.617+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.common.io.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-common-io
   2025-05-20 11:07:45.617 | [2025-05-20T15:07:45.617+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.elasticsearch.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package 
apache-airflow-providers-elasticsearch
   2025-05-20 11:07:45.618 | [2025-05-20T15:07:45.618+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.docker.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-docker
   2025-05-20 11:07:45.619 | [2025-05-20T15:07:45.618+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.slack.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-slack
   2025-05-20 11:07:45.619 | [2025-05-20T15:07:45.619+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.git.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-git
   2025-05-20 11:07:45.620 | [2025-05-20T15:07:45.619+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.grpc.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-grpc
   2025-05-20 11:07:45.620 | [2025-05-20T15:07:45.620+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.microsoft.azure.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package 
apache-airflow-providers-microsoft-azure
   2025-05-20 11:07:45.622 | [2025-05-20T15:07:45.622+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.amazon.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-amazon
   2025-05-20 11:07:45.625 | [2025-05-20T15:07:45.625+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.common.messaging.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package 
apache-airflow-providers-common-messaging
   2025-05-20 11:07:45.625 | [2025-05-20T15:07:45.625+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.common.compat.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package 
apache-airflow-providers-common-compat
   2025-05-20 11:07:45.626 | [2025-05-20T15:07:45.626+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.openlineage.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package 
apache-airflow-providers-openlineage
   2025-05-20 11:07:45.626 | [2025-05-20T15:07:45.626+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.celery.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-celery
   2025-05-20 11:07:45.628 | [2025-05-20T15:07:45.627+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.sftp.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-sftp
   2025-05-20 11:07:45.628 | [2025-05-20T15:07:45.628+0000] 
{providers_manager.py:598} DEBUG - Loading EntryPoint(name='provider_info', 
value='airflow.providers.ssh.get_provider_info:get_provider_info', 
group='apache_airflow_provider') from package apache-airflow-providers-ssh
   2025-05-20 11:07:45.628 | [2025-05-20T15:07:45.628+0000] 
{providers_manager.py:359} DEBUG - Initialization of Providers Manager[list] 
took 0.03 seconds
   2025-05-20 11:07:45.628 | [2025-05-20T15:07:45.628+0000] 
{configuration.py:1862} DEBUG - Loading providers configuration
   2025-05-20 11:07:45.635 | [2025-05-20T15:07:45.635+0000] 
{providers_manager.py:359} DEBUG - Initialization of Providers Manager[config] 
took 0.04 seconds
   2025-05-20 11:07:45.716 | [2025-05-20T15:07:45.716+0000] {base.py:65} INFO - 
Connection Retrieved 'aws_default'
   2025-05-20 11:07:45.716 | [2025-05-20T15:07:45.716+0000] 
{connection_wrapper.py:325} INFO - AWS Connection (conn_id='aws_default', 
conn_type='aws') credentials retrieved from login and password.
   2025-05-20 11:07:45.717 | [2025-05-20T15:07:45.717+0000] {hooks.py:482} 
DEBUG - Changing event name from creating-client-class.iot-data to 
creating-client-class.iot-data-plane
   2025-05-20 11:07:45.718 | [2025-05-20T15:07:45.718+0000] {hooks.py:482} 
DEBUG - Changing event name from before-call.apigateway to 
before-call.api-gateway
   2025-05-20 11:07:45.718 | [2025-05-20T15:07:45.718+0000] {hooks.py:482} 
DEBUG - Changing event name from request-created.machinelearning.Predict to 
request-created.machine-learning.Predict
   2025-05-20 11:07:45.718 | [2025-05-20T15:07:45.718+0000] {hooks.py:482} 
DEBUG - Changing event name from 
before-parameter-build.autoscaling.CreateLaunchConfiguration to 
before-parameter-build.auto-scaling.CreateLaunchConfiguration
   2025-05-20 11:07:45.718 | [2025-05-20T15:07:45.718+0000] {hooks.py:482} 
DEBUG - Changing event name from before-parameter-build.route53 to 
before-parameter-build.route-53
   2025-05-20 11:07:45.719 | [2025-05-20T15:07:45.719+0000] {hooks.py:482} 
DEBUG - Changing event name from request-created.cloudsearchdomain.Search to 
request-created.cloudsearch-domain.Search
   2025-05-20 11:07:45.719 | [2025-05-20T15:07:45.719+0000] {hooks.py:482} 
DEBUG - Changing event name from 
docs.*.autoscaling.CreateLaunchConfiguration.complete-section to 
docs.*.auto-scaling.CreateLaunchConfiguration.complete-section
   2025-05-20 11:07:45.720 | [2025-05-20T15:07:45.720+0000] {hooks.py:482} 
DEBUG - Changing event name from before-parameter-build.logs.CreateExportTask 
to before-parameter-build.cloudwatch-logs.CreateExportTask
   2025-05-20 11:07:45.720 | [2025-05-20T15:07:45.720+0000] {hooks.py:482} 
DEBUG - Changing event name from docs.*.logs.CreateExportTask.complete-section 
to docs.*.cloudwatch-logs.CreateExportTask.complete-section
   2025-05-20 11:07:45.720 | [2025-05-20T15:07:45.720+0000] {hooks.py:482} 
DEBUG - Changing event name from 
before-parameter-build.cloudsearchdomain.Search to 
before-parameter-build.cloudsearch-domain.Search
   2025-05-20 11:07:45.720 | [2025-05-20T15:07:45.720+0000] {hooks.py:482} 
DEBUG - Changing event name from 
docs.*.cloudsearchdomain.Search.complete-section to 
docs.*.cloudsearch-domain.Search.complete-section
   2025-05-20 11:07:45.721 | [2025-05-20T15:07:45.720+0000] {session.py:379} 
DEBUG - Setting config variable for region to 'us-east-1'
   2025-05-20 11:07:45.721 | [2025-05-20T15:07:45.721+0000] 
{providers_manager.py:356} DEBUG - Initializing Providers Manager[hooks]
   2025-05-20 11:07:45.721 | [2025-05-20T15:07:45.721+0000] 
{providers_manager.py:359} DEBUG - Initialization of Providers Manager[hooks] 
took 0.00 seconds
   2025-05-20 11:07:45.763 | [2025-05-20T15:07:45.763+0000] {loaders.py:181} 
DEBUG - Loading JSON file: 
/home/airflow/.local/lib/python3.12/site-packages/botocore/data/endpoints.json
   2025-05-20 11:07:45.771 | [2025-05-20T15:07:45.771+0000] {loaders.py:181} 
DEBUG - Loading JSON file: 
/home/airflow/.local/lib/python3.12/site-packages/botocore/data/sdk-default-configuration.json
   2025-05-20 11:07:45.771 | [2025-05-20T15:07:45.771+0000] {hooks.py:238} 
DEBUG - Event choose-service-name: calling handler <function 
handle_service_name_alias at 0xffffb5697060>
   2025-05-20 11:07:45.777 | [2025-05-20T15:07:45.777+0000] {loaders.py:181} 
DEBUG - Loading JSON file: 
/home/airflow/.local/lib/python3.12/site-packages/botocore/data/logs/2014-03-28/service-2.json.gz
   2025-05-20 11:07:45.784 | [2025-05-20T15:07:45.784+0000] {loaders.py:181} 
DEBUG - Loading JSON file: 
/home/airflow/.local/lib/python3.12/site-packages/botocore/data/logs/2014-03-28/endpoint-rule-set-1.json.gz
   2025-05-20 11:07:45.784 | [2025-05-20T15:07:45.784+0000] {loaders.py:181} 
DEBUG - Loading JSON file: 
/home/airflow/.local/lib/python3.12/site-packages/botocore/data/partitions.json
   2025-05-20 11:07:45.785 | [2025-05-20T15:07:45.785+0000] {hooks.py:238} 
DEBUG - Event creating-client-class.cloudwatch-logs: calling handler <function 
add_generate_presigned_url at 0xffffb561f920>
   2025-05-20 11:07:45.785 | [2025-05-20T15:07:45.785+0000] 
{configprovider.py:983} DEBUG - Looking for endpoint for logs via: 
environment_service
   2025-05-20 11:07:45.785 | [2025-05-20T15:07:45.785+0000] 
{configprovider.py:983} DEBUG - Looking for endpoint for logs via: 
environment_global
   2025-05-20 11:07:45.785 | [2025-05-20T15:07:45.785+0000] 
{configprovider.py:983} DEBUG - Looking for endpoint for logs via: 
config_service
   2025-05-20 11:07:45.785 | [2025-05-20T15:07:45.785+0000] 
{configprovider.py:983} DEBUG - Looking for endpoint for logs via: config_global
   2025-05-20 11:07:45.785 | [2025-05-20T15:07:45.785+0000] 
{configprovider.py:999} DEBUG - No configured endpoint found.
   2025-05-20 11:07:45.786 | [2025-05-20T15:07:45.786+0000] {endpoint.py:414} 
DEBUG - Setting logs timeout as (60, 60)
   2025-05-20 11:07:45.787 | [2025-05-20T15:07:45.787+0000] {loaders.py:181} 
DEBUG - Loading JSON file: 
/home/airflow/.local/lib/python3.12/site-packages/botocore/data/_retry.json
   2025-05-20 11:07:45.787 | [2025-05-20T15:07:45.787+0000] {client.py:289} 
DEBUG - Registering retry handlers for service: logs
   2025-05-20 11:07:45.790 | [2025-05-20T15:07:45.790+0000] {loaders.py:181} 
DEBUG - Loading JSON file: 
/home/airflow/.local/lib/python3.12/site-packages/botocore/data/logs/2014-03-28/paginators-1.json
   2025-05-20 11:07:45.791 | [2025-05-20T15:07:45.791+0000] {hooks.py:238} 
DEBUG - Event before-parameter-build.cloudwatch-logs.DescribeLogGroups: calling 
handler <function generate_idempotent_uuid at 0xffffb56b4b80>
   2025-05-20 11:07:45.791 | [2025-05-20T15:07:45.791+0000] {hooks.py:238} 
DEBUG - Event before-parameter-build.cloudwatch-logs.DescribeLogGroups: calling 
handler <function _handle_request_validation_mode_member at 0xffffb56b76a0>
   2025-05-20 11:07:45.792 | [2025-05-20T15:07:45.791+0000] {regions.py:504} 
DEBUG - Calling endpoint provider with parameters: {'Region': 'us-east-1', 
'UseDualStack': False, 'UseFIPS': False}
   2025-05-20 11:07:45.793 | [2025-05-20T15:07:45.792+0000] {regions.py:519} 
DEBUG - Endpoint provider result: https://logs.us-east-1.amazonaws.com
   2025-05-20 11:07:45.793 | [2025-05-20T15:07:45.793+0000] {hooks.py:238} 
DEBUG - Event before-call.cloudwatch-logs.DescribeLogGroups: calling handler 
<function add_recursion_detection_header at 0xffffb5697a60>
   2025-05-20 11:07:45.793 | [2025-05-20T15:07:45.793+0000] {hooks.py:238} 
DEBUG - Event before-call.cloudwatch-logs.DescribeLogGroups: calling handler 
<function add_query_compatibility_header at 0xffffb56b7600>
   2025-05-20 11:07:45.793 | [2025-05-20T15:07:45.793+0000] {hooks.py:238} 
DEBUG - Event before-call.cloudwatch-logs.DescribeLogGroups: calling handler 
<function inject_api_version_header_if_needed at 0xffffb56b6660>
   2025-05-20 11:07:45.793 | [2025-05-20T15:07:45.793+0000] {endpoint.py:114} 
DEBUG - Making request for OperationModel(name=DescribeLogGroups) with params: 
{'url_path': '/', 'query_string': '', 'method': 'POST', 'headers': 
{'X-Amz-Target': 'Logs_20140328.DescribeLogGroups', 'Content-Type': 
'application/x-amz-json-1.1', 'User-Agent': 'Boto3/1.37.3 md/Botocore#1.37.3 
ua/2.0 os/linux#6.10.14-linuxkit md/arch#aarch64 lang/python#3.12.10 
md/pyimpl#CPython cfg/retry-mode#legacy Botocore/1.37.3 Airflow/3.0.1 
AmPP/9.7.0 Caller/Unknown DagRunKey/00000000-0000-0000-0000-000000000000'}, 
'body': b'{"logGroupNamePrefix": "REDACTED"}', 'url': 
'https://logs.us-east-1.amazonaws.com/', 'context': {'client_region': 
'us-east-1', 'client_config': <botocore.config.Config object at 
0xffffafeb6870>, 'has_streaming_input': False, 'auth_type': None, 
'unsigned_payload': None}}
   2025-05-20 11:07:45.794 | [2025-05-20T15:07:45.793+0000] {hooks.py:238} 
DEBUG - Event request-created.cloudwatch-logs.DescribeLogGroups: calling 
handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner 
object at 0xffffafeb7ec0>>
   2025-05-20 11:07:45.795 | [2025-05-20T15:07:45.794+0000] {hooks.py:238} 
DEBUG - Event choose-signer.cloudwatch-logs.DescribeLogGroups: calling handler 
<function set_operation_specific_signer at 0xffffb56b49a0>
   2025-05-20 11:07:45.795 | [2025-05-20T15:07:45.794+0000] {auth.py:431} DEBUG 
- Calculating signature using v4 auth.
   2025-05-20 11:07:45.795 | [2025-05-20T15:07:45.794+0000] {auth.py:432} DEBUG 
- CanonicalRequest:
   2025-05-20 11:07:45.795 | POST
   2025-05-20 11:07:45.795 | /
   2025-05-20 11:07:45.795 | 
   2025-05-20 11:07:45.795 | content-type:application/x-amz-json-1.1
   2025-05-20 11:07:45.795 | host:logs.us-east-1.amazonaws.com
   2025-05-20 11:07:45.795 | x-amz-date:20250520T150745Z
   2025-05-20 11:07:45.795 | x-amz-target:Logs_20140328.DescribeLogGroups
   2025-05-20 11:07:45.795 | 
   2025-05-20 11:07:45.795 | content-type;host;x-amz-date;x-amz-target
   2025-05-20 11:07:45.795 | 
4290dae78fd8cb546be9ea55cc92d2f03184b0dc71d52b7d98d0564e40ba21ed
   2025-05-20 11:07:45.795 | [2025-05-20T15:07:45.794+0000] {auth.py:434} DEBUG 
- StringToSign:
   2025-05-20 11:07:45.795 | AWS4-HMAC-SHA256
   2025-05-20 11:07:45.795 | 20250520T150745Z
   2025-05-20 11:07:45.795 | 20250520/us-east-1/logs/aws4_request
   2025-05-20 11:07:45.795 | 
108425f104c89d3081e6c10f9fa85bc0ee30e79ec8c74faafba4bb2bbc41d79b
   2025-05-20 11:07:45.795 | [2025-05-20T15:07:45.795+0000] {auth.py:436} DEBUG 
- Signature:
   2025-05-20 11:07:45.795 | 
64b3ed037692b15dd98d35ad91b7936119c0731554f4d3ac3a77b561d0b63538
   2025-05-20 11:07:45.795 | [2025-05-20T15:07:45.795+0000] {hooks.py:238} 
DEBUG - Event request-created.cloudwatch-logs.DescribeLogGroups: calling 
handler <function add_retry_headers at 0xffffb56b6e80>
   2025-05-20 11:07:45.796 | [2025-05-20T15:07:45.795+0000] {endpoint.py:263} 
DEBUG - Sending http request: <AWSPreparedRequest stream_output=False, 
method=POST, url=https://logs.us-east-1.amazonaws.com/, 
headers={'X-Amz-Target': b'Logs_20140328.DescribeLogGroups', 'Content-Type': 
b'application/x-amz-json-1.1', 'User-Agent': b'Boto3/1.37.3 md/Botocore#1.37.3 
ua/2.0 os/linux#6.10.14-linuxkit md/arch#aarch64 lang/python#3.12.10 
md/pyimpl#CPython cfg/retry-mode#legacy Botocore/1.37.3 Airflow/3.0.1 
AmPP/9.7.0 Caller/Unknown DagRunKey/00000000-0000-0000-0000-000000000000', 
'X-Amz-Date': b'20250520T150745Z', 'Authorization': b'AWS4-HMAC-SHA256 
Credential=AKIA5KRESCYYYRXCTOOL/20250520/us-east-1/logs/aws4_request, 
SignedHeaders=content-type;host;x-amz-date;x-amz-target, 
Signature=64b3ed037692b15dd98d35ad91b7936119c0731554f4d3ac3a77b561d0b63538', 
'amz-sdk-invocation-id': b'de8bdb51-2f49-4cad-b75a-28194b9265aa', 
'amz-sdk-request': b'attempt=1', 'Content-Length': '44'}>
   2025-05-20 11:07:45.796 | [2025-05-20T15:07:45.796+0000] {httpsession.py:97} 
DEBUG - Certificate path: 
/home/airflow/.local/lib/python3.12/site-packages/certifi/cacert.pem
   2025-05-20 11:07:45.797 | [2025-05-20T15:07:45.796+0000] 
{connectionpool.py:1049} DEBUG - Starting new HTTPS connection (1): 
logs.us-east-1.amazonaws.com:443
   2025-05-20 11:07:45.916 | [2025-05-20T15:07:45.916+0000] 
{connectionpool.py:544} DEBUG - https://logs.us-east-1.amazonaws.com:443 "POST 
/ HTTP/1.1" 200 329
   2025-05-20 11:07:45.916 | [2025-05-20T15:07:45.916+0000] {parsers.py:250} 
DEBUG - Response headers: {'x-amzn-RequestId': 
'a0c823a3-5113-4706-b7b8-2206aebdaa91', 'Content-Type': 
'application/x-amz-json-1.1', 'Content-Length': '329', 'Date': 'Tue, 20 May 
2025 15:07:45 GMT'}
   2025-05-20 11:07:45.916 | [2025-05-20T15:07:45.916+0000] {parsers.py:251} 
DEBUG - Response body:
   2025-05-20 11:07:45.916 | 
b'{"logGroups":[{"arn":"arn:aws:logs:us-east-1:REDACTED:log-group:REDACTED:*","creationTime":1747673114546,"logGroupArn":"arn:aws:logs:us-east-1:REDACTED:log-group:REDACTED","logGroupClass":"STANDARD","logGroupName":"REDACTED","metricFilterCount":0,"retentionInDays":90,"storedBytes":425554}]}'
   2025-05-20 11:07:45.917 | [2025-05-20T15:07:45.917+0000] {hooks.py:238} 
DEBUG - Event needs-retry.cloudwatch-logs.DescribeLogGroups: calling handler 
<botocore.retryhandler.RetryHandler object at 0xffffb01cb0e0>
   2025-05-20 11:07:45.917 | [2025-05-20T15:07:45.917+0000] 
{retryhandler.py:211} DEBUG - No retry needed.
   2025-05-20 11:07:45.982 | 2025-05-20 15:07:45.981918 [info     ] starting 
stale bundle cleanup process [airflow.providers.celery.cli.celery_command]
   2025-05-20 11:07:45.998 | [2025-05-20 15:07:45 +0000] [24] [INFO] Starting 
gunicorn 23.0.0
   2025-05-20 11:07:46.000 | [2025-05-20 15:07:46 +0000] [24] [INFO] Listening 
at: http://[::]:8793 (24)
   2025-05-20 11:07:46.000 | [2025-05-20 15:07:46 +0000] [24] [INFO] Using 
worker: sync
   2025-05-20 11:07:46.006 | [2025-05-20 15:07:46 +0000] [26] [INFO] Booting 
worker with pid: 26
   2025-05-20 11:07:46.044 | [2025-05-20 15:07:46 +0000] [27] [INFO] Booting 
worker with pid: 27
   2025-05-20 11:07:46.931 | 2025-05-20 15:07:46.931745 [debug    ] | Worker: 
Preparing bootsteps. [celery.bootsteps]
   2025-05-20 11:07:46.933 | 2025-05-20 15:07:46.933751 [debug    ] | Worker: 
Building graph...    [celery.bootsteps]
   2025-05-20 11:07:46.934 | 2025-05-20 15:07:46.934814 [debug    ] | Worker: 
New boot order: {Timer, Hub, Pool, Autoscaler, StateDB, Beat, Consumer} 
[celery.bootsteps]
   2025-05-20 11:07:46.942 | 2025-05-20 15:07:46.942183 [debug    ] | Consumer: 
Preparing bootsteps. [celery.bootsteps]
   2025-05-20 11:07:46.942 | 2025-05-20 15:07:46.942902 [debug    ] | Consumer: 
Building graph...  [celery.bootsteps]
   2025-05-20 11:07:46.950 | 2025-05-20 15:07:46.950797 [debug    ] | Consumer: 
New boot order: {Connection, Agent, Events, Heart, Mingle, Gossip, Tasks, 
DelayedDelivery, Control, event loop} [celery.bootsteps]
   2025-05-20 11:07:46.952 |  
   2025-05-20 11:07:46.952 |  -------------- celery@5f1d5b3f2bb4 v5.5.2 
(immunity)
   2025-05-20 11:07:46.952 | --- ***** ----- 
   2025-05-20 11:07:46.952 | -- ******* ---- 
Linux-6.10.14-linuxkit-aarch64-with-glibc2.36 2025-05-20 15:07:46
   2025-05-20 11:07:46.952 | - *** --- * --- 
   2025-05-20 11:07:46.952 | - ** ---------- [config]
   2025-05-20 11:07:46.952 | - ** ---------- .> app:         
airflow.providers.celery.executors.celery_executor:0xffffb0257ce0
   2025-05-20 11:07:46.952 | - ** ---------- .> transport:   
redis://redis:6379/0
   2025-05-20 11:07:46.952 | - ** ---------- .> results:     
postgresql://airflow:**@postgres/airflow
   2025-05-20 11:07:46.952 | - *** --- * --- .> concurrency: 16 (prefork)
   2025-05-20 11:07:46.952 | -- ******* ---- .> task events: OFF (enable -E to 
monitor tasks in this worker)
   2025-05-20 11:07:46.952 | --- ***** ----- 
   2025-05-20 11:07:46.952 |  -------------- [queues]
   2025-05-20 11:07:46.952 |                 .> default          
exchange=default(direct) key=default
   2025-05-20 11:07:46.952 |                 
   2025-05-20 11:07:46.952 | 
   2025-05-20 11:07:46.952 | [tasks]
   2025-05-20 11:07:46.952 |   . celery.accumulate
   2025-05-20 11:07:46.952 |   . celery.backend_cleanup
   2025-05-20 11:07:46.952 |   . celery.chain
   2025-05-20 11:07:46.952 |   . celery.chord
   2025-05-20 11:07:46.952 |   . celery.chord_unlock
   2025-05-20 11:07:46.952 |   . celery.chunks
   2025-05-20 11:07:46.952 |   . celery.group
   2025-05-20 11:07:46.952 |   . celery.map
   2025-05-20 11:07:46.952 |   . celery.starmap
   2025-05-20 11:07:46.952 |   . execute_workload
   2025-05-20 11:07:46.952 | 
   2025-05-20 11:07:46.952 | 2025-05-20 15:07:46.952435 [debug    ] | Worker: 
Starting Hub         [celery.bootsteps]
   2025-05-20 11:07:46.952 | 2025-05-20 15:07:46.952531 [debug    ] ^-- substep 
ok                 [celery.bootsteps]
   2025-05-20 11:07:46.952 | 2025-05-20 15:07:46.952568 [debug    ] | Worker: 
Starting Pool        [celery.bootsteps]
   2025-05-20 11:07:48.649 | 2025-05-20 15:07:48.649195 [debug    ] ^-- substep 
ok                 [celery.bootsteps]
   2025-05-20 11:07:48.649 | 2025-05-20 15:07:48.649458 [debug    ] | Worker: 
Starting Consumer    [celery.bootsteps]
   2025-05-20 11:07:48.649 | 2025-05-20 15:07:48.649698 [debug    ] | Consumer: 
Starting Connection [celery.bootsteps]
   2025-05-20 11:07:48.653 | 2025-05-20 15:07:48.653060 [info     ] Connected 
to redis://redis:6379/0 [celery.worker.consumer.connection]
   2025-05-20 11:07:48.653 | 2025-05-20 15:07:48.653121 [debug    ] ^-- substep 
ok                 [celery.bootsteps]
   2025-05-20 11:07:48.653 | 2025-05-20 15:07:48.653168 [debug    ] | Consumer: 
Starting Events    [celery.bootsteps]
   2025-05-20 11:07:48.654 | 2025-05-20 15:07:48.654399 [debug    ] ^-- substep 
ok                 [celery.bootsteps]
   2025-05-20 11:07:48.654 | 2025-05-20 15:07:48.654463 [debug    ] | Consumer: 
Starting Heart     [celery.bootsteps]
   2025-05-20 11:07:48.655 | 2025-05-20 15:07:48.655504 [debug    ] ^-- substep 
ok                 [celery.bootsteps]
   2025-05-20 11:07:48.655 | 2025-05-20 15:07:48.655553 [debug    ] | Consumer: 
Starting Mingle    [celery.bootsteps]
   2025-05-20 11:07:48.655 | 2025-05-20 15:07:48.655599 [info     ] mingle: 
searching for neighbors [celery.worker.consumer.mingle]
   2025-05-20 11:07:49.661 | 2025-05-20 15:07:49.661365 [info     ] mingle: all 
alone              [celery.worker.consumer.mingle]
   2025-05-20 11:07:49.661 | 2025-05-20 15:07:49.661564 [debug    ] ^-- substep 
ok                 [celery.bootsteps]
   2025-05-20 11:07:49.661 | 2025-05-20 15:07:49.661671 [debug    ] | Consumer: 
Starting Gossip    [celery.bootsteps]
   2025-05-20 11:07:49.663 | 2025-05-20 15:07:49.663817 [debug    ] ^-- substep 
ok                 [celery.bootsteps]
   2025-05-20 11:07:49.663 | 2025-05-20 15:07:49.663899 [debug    ] | Consumer: 
Starting Tasks     [celery.bootsteps]
   2025-05-20 11:07:49.666 | 2025-05-20 15:07:49.666622 [debug    ] ^-- substep 
ok                 [celery.bootsteps]
   2025-05-20 11:07:49.666 | 2025-05-20 15:07:49.666677 [debug    ] | Consumer: 
Starting Control   [celery.bootsteps]
   2025-05-20 11:07:49.668 | 2025-05-20 15:07:49.668350 [debug    ] ^-- substep 
ok                 [celery.bootsteps]
   2025-05-20 11:07:49.668 | 2025-05-20 15:07:49.668415 [debug    ] | Consumer: 
Starting event loop [celery.bootsteps]
   2025-05-20 11:07:49.668 | 2025-05-20 15:07:49.668493 [debug    ] | Worker: 
Hub.register Pool... [celery.bootsteps]
   2025-05-20 11:07:49.668 | 2025-05-20 15:07:49.668894 [info     ] 
celery@5f1d5b3f2bb4 ready.     [celery.apps.worker]
   2025-05-20 11:07:49.669 | 2025-05-20 15:07:49.669787 [debug    ] basic.qos: 
prefetch_count->16  [kombu.common]
   2025-05-20 11:07:49.672 | 2025-05-20 15:07:49.671900 [info     ] Task 
execute_workload[923401f6-b277-412f-81dd-b468e0c9ff16] received 
[celery.worker.strategy]
   2025-05-20 11:07:49.672 | 2025-05-20 15:07:49.672080 [debug    ] TaskPool: 
Apply <function fast_trace_task at 0xffffaefb91c0> (args:('execute_workload', 
'923401f6-b277-412f-81dd-b468e0c9ff16', {'lang': 'py', 'task': 
'execute_workload', 'id': '923401f6-b277-412f-81dd-b468e0c9ff16', 'shadow': 
None, 'eta': None, 'expires': None, 'group': None, 'group_index': None, 
'retries': 0, 'timelimit': [None, None], 'root_id': 
'923401f6-b277-412f-81dd-b468e0c9ff16', 'parent_id': None, 'argsrepr':... 
kwargs:{}) [celery.pool]
   2025-05-20 11:07:49.701 | 2025-05-20 15:07:49.701497 [info     ] 
[923401f6-b277-412f-81dd-b468e0c9ff16] Executing workload in Celery: 
token='eyJ***' ti=TaskInstance(id=UUID('0196ee3c-79aa-7f8f-aeda-ef09386fc62d'), 
task_id='test', dag_id='test_dag', 
run_id='manual__2025-05-20T15:07:42.882881+00:00', try_number=1, map_index=-1, 
pool_slots=1, queue='default', priority_weight=1, executor_config=None, 
parent_context_carrier={}, context_carrier={}, queued_dttm=None) 
dag_rel_path=PurePosixPath('test.py') 
bundle_info=BundleInfo(name='dags-folder', version=None) 
log_path='dag_id=test_dag/run_id=manual__2025-05-20T15:07:42.882881+00:00/task_id=test/attempt=1.log'
 type='ExecuteTask' [airflow.providers.celery.executors.celery_executor_utils]
   2025-05-20 11:07:49.719 | 2025-05-20 15:07:49.719150 [info     ] Secrets 
backends loaded for worker [supervisor] 
backend_classes=['EnvironmentVariablesBackend'] count=1
   2025-05-20 11:07:49.724 | 2025-05-20 15:07:49.724027 [debug    ] 
connect_tcp.started host='airflow-apiserver' port=8080 local_address=None 
timeout=5.0 socket_options=None [httpcore.connection]
   2025-05-20 11:07:49.724 | 2025-05-20 15:07:49.724611 [debug    ] 
connect_tcp.complete return_value=<httpcore._backends.sync.SyncStream object at 
0xffff8a18cda0> [httpcore.connection]
   2025-05-20 11:07:49.724 | 2025-05-20 15:07:49.724777 [debug    ] 
send_request_headers.started request=<Request [b'PATCH']> [httpcore.http11]
   2025-05-20 11:07:49.725 | 2025-05-20 15:07:49.725181 [debug    ] 
send_request_headers.complete  [httpcore.http11]
   2025-05-20 11:07:49.725 | 2025-05-20 15:07:49.725225 [debug    ] 
send_request_body.started request=<Request [b'PATCH']> [httpcore.http11]
   2025-05-20 11:07:49.725 | 2025-05-20 15:07:49.725318 [debug    ] 
send_request_body.complete     [httpcore.http11]
   2025-05-20 11:07:49.725 | 2025-05-20 15:07:49.725359 [debug    ] 
receive_response_headers.started request=<Request [b'PATCH']> [httpcore.http11]
   2025-05-20 11:07:49.866 | 2025-05-20 15:07:49.866749 [debug    ] 
receive_response_headers.complete return_value=(b'HTTP/1.1', 200, b'OK', 
[(b'date', b'Tue, 20 May 2025 15:07:49 GMT'), (b'server', b'uvicorn'), 
(b'content-type', b'application/json'), (b'airflow-api-version', 
b'2025-04-28'), (b'vary', b'Accept-Encoding'), (b'content-encoding', b'gzip'), 
(b'transfer-encoding', b'chunked')]) [httpcore.http11]
   2025-05-20 11:07:49.867 | 2025-05-20 15:07:49.867738 [debug    ] 
receive_response_body.started request=<Request [b'PATCH']> [httpcore.http11]
   2025-05-20 11:07:49.867 | 2025-05-20 15:07:49.867893 [debug    ] 
receive_response_body.complete [httpcore.http11]
   2025-05-20 11:07:49.868 | 2025-05-20 15:07:49.868385 [debug    ] Sending     
                   [supervisor] 
msg=StartupDetails(ti=TaskInstance(id=UUID('0196ee3c-79aa-7f8f-aeda-ef09386fc62d'),
 task_id='test', dag_id='test_dag', 
run_id='manual__2025-05-20T15:07:42.882881+00:00', try_number=1, map_index=-1, 
pool_slots=1, queue='default', priority_weight=1, executor_config=None, 
parent_context_carrier={}, context_carrier={}, queued_dttm=None), 
dag_rel_path='test.py', bundle_info=BundleInfo(name='dags-folder', 
version=None), requests_fd=100, start_date=datetime.datetime(2025, 5, 20, 15, 
7, 49, 722156, tzinfo=datetime.timezone.utc), 
ti_context=TIRunContext(dag_run=DagRun(dag_id='test_dag', 
run_id='manual__2025-05-20T15:07:42.882881+00:00', 
logical_date=datetime.datetime(2025, 5, 20, 15, 7, 41, 675000, 
tzinfo=TzInfo(UTC)), data_interval_start=datetime.datetime(2025, 5, 20, 15, 7, 
41, 675000, tzinfo=TzInfo(UTC)), data_interval_end=datetime.datetime(2025, 5, 
20, 15, 7, 41, 675000, tzinfo=TzInfo(UTC)), run_after=datetime.datetime(2025, 5, 20, 15, 7, 41, 675000, 
tzinfo=TzInfo(UTC)), start_date=datetime.datetime(2025, 5, 20, 15, 7, 43, 
572995, tzinfo=TzInfo(UTC)), end_date=None, clear_number=0, 
run_type=<DagRunType.MANUAL: 'manual'>, conf={}, consumed_asset_events=[]), 
task_reschedule_count=0, max_tries=0, variables=[], connections=[], 
upstream_map_indexes={}, next_method=None, next_kwargs=None, 
xcom_keys_to_clear=[], should_retry=False), type='StartupDetails')
   2025-05-20 11:07:49.868 | 2025-05-20 15:07:49.867961 [debug    ] 
response_closed.started        [httpcore.http11]
   2025-05-20 11:07:49.868 | 2025-05-20 15:07:49.868023 [debug    ] 
response_closed.complete       [httpcore.http11]
   2025-05-20 11:07:49.873 | 2025-05-20 15:07:49.873485 [debug    ] 
send_request_headers.started request=<Request [b'PUT']> [httpcore.http11]
   2025-05-20 11:07:49.873 | 2025-05-20 15:07:49.873658 [debug    ] 
send_request_headers.complete  [httpcore.http11]
   2025-05-20 11:07:49.873 | 2025-05-20 15:07:49.873697 [debug    ] 
send_request_body.started request=<Request [b'PUT']> [httpcore.http11]
   2025-05-20 11:07:49.873 | 2025-05-20 15:07:49.873750 [debug    ] 
send_request_body.complete     [httpcore.http11]
   2025-05-20 11:07:49.873 | 2025-05-20 15:07:49.873779 [debug    ] 
receive_response_headers.started request=<Request [b'PUT']> [httpcore.http11]
   2025-05-20 11:07:49.878 | 2025-05-20 15:07:49.878490 [debug    ] 
receive_response_headers.complete return_value=(b'HTTP/1.1', 204, b'No 
Content', [(b'date', b'Tue, 20 May 2025 15:07:49 GMT'), (b'server', 
b'uvicorn'), (b'content-type', b'application/json'), (b'airflow-api-version', 
b'2025-04-28')]) [httpcore.http11]
   2025-05-20 11:07:49.878 | 2025-05-20 15:07:49.878717 [debug    ] 
receive_response_body.started request=<Request [b'PUT']> [httpcore.http11]
   2025-05-20 11:07:49.878 | 2025-05-20 15:07:49.878768 [debug    ] 
receive_response_body.complete [httpcore.http11]
   2025-05-20 11:07:49.878 | 2025-05-20 15:07:49.878814 [debug    ] 
response_closed.started        [httpcore.http11]
   2025-05-20 11:07:49.878 | 2025-05-20 15:07:49.878847 [debug    ] 
response_closed.complete       [httpcore.http11]
   2025-05-20 11:07:49.957 | 2025-05-20 15:07:49.957792 [debug    ] Received 
message from task runner [supervisor] 
msg=SetRenderedFields(rendered_fields={'templates_dict': None, 'op_args': [], 
'op_kwargs': {}}, type='SetRenderedFields')
   2025-05-20 11:07:49.958 | 2025-05-20 15:07:49.958226 [debug    ] 
send_request_headers.started request=<Request [b'PUT']> [httpcore.http11]
   2025-05-20 11:07:49.958 | 2025-05-20 15:07:49.958359 [debug    ] 
send_request_headers.complete  [httpcore.http11]
   2025-05-20 11:07:49.958 | 2025-05-20 15:07:49.958394 [debug    ] 
send_request_body.started request=<Request [b'PUT']> [httpcore.http11]
   2025-05-20 11:07:49.958 | 2025-05-20 15:07:49.958445 [debug    ] 
send_request_body.complete     [httpcore.http11]
   2025-05-20 11:07:49.958 | 2025-05-20 15:07:49.958477 [debug    ] 
receive_response_headers.started request=<Request [b'PUT']> [httpcore.http11]
   2025-05-20 11:07:50.123 | 2025-05-20 15:07:50.122926 [debug    ] 
receive_response_headers.complete return_value=(b'HTTP/1.1', 201, b'Created', 
[(b'date', b'Tue, 20 May 2025 15:07:49 GMT'), (b'server', b'uvicorn'), 
(b'content-type', b'application/json'), (b'airflow-api-version', 
b'2025-04-28'), (b'vary', b'Accept-Encoding'), (b'content-encoding', b'gzip'), 
(b'transfer-encoding', b'chunked')]) [httpcore.http11]
   2025-05-20 11:07:50.123 | 2025-05-20 15:07:50.123211 [debug    ] 
receive_response_body.started request=<Request [b'PUT']> [httpcore.http11]
   2025-05-20 11:07:50.123 | 2025-05-20 15:07:50.123314 [debug    ] 
receive_response_body.complete [httpcore.http11]
   2025-05-20 11:07:50.123 | 2025-05-20 15:07:50.123361 [debug    ] 
response_closed.started        [httpcore.http11]
   2025-05-20 11:07:50.123 | 2025-05-20 15:07:50.123396 [debug    ] 
response_closed.complete       [httpcore.http11]
   2025-05-20 11:07:50.124 | 2025-05-20 15:07:50.124609 [debug    ] Received 
message from task runner [supervisor] msg=SucceedTask(state='success', 
end_date=datetime.datetime(2025, 5, 20, 15, 7, 49, 958659, tzinfo=TzInfo(UTC)), 
task_outlets=[], outlet_events=[], rendered_map_index=None, type='SucceedTask')
   2025-05-20 11:07:50.125 | 2025-05-20 15:07:50.124934 [debug    ] 
send_request_headers.started request=<Request [b'PATCH']> [httpcore.http11]
   2025-05-20 11:07:50.125 | 2025-05-20 15:07:50.125103 [debug    ] 
send_request_headers.complete  [httpcore.http11]
   2025-05-20 11:07:50.125 | 2025-05-20 15:07:50.125143 [debug    ] 
send_request_body.started request=<Request [b'PATCH']> [httpcore.http11]
   2025-05-20 11:07:50.125 | 2025-05-20 15:07:50.125207 [debug    ] 
send_request_body.complete     [httpcore.http11]
   2025-05-20 11:07:50.125 | 2025-05-20 15:07:50.125240 [debug    ] 
receive_response_headers.started request=<Request [b'PATCH']> [httpcore.http11]
   2025-05-20 11:07:50.132 | 2025-05-20 15:07:50.132760 [debug    ] 
receive_response_headers.complete return_value=(b'HTTP/1.1', 204, b'No 
Content', [(b'date', b'Tue, 20 May 2025 15:07:49 GMT'), (b'server', 
b'uvicorn'), (b'content-type', b'application/json'), (b'airflow-api-version', 
b'2025-04-28')]) [httpcore.http11]
   2025-05-20 11:07:50.133 | 2025-05-20 15:07:50.132979 [debug    ] 
receive_response_body.started request=<Request [b'PATCH']> [httpcore.http11]
   2025-05-20 11:07:50.133 | 2025-05-20 15:07:50.133030 [debug    ] 
receive_response_body.complete [httpcore.http11]
   2025-05-20 11:07:50.133 | 2025-05-20 15:07:50.133074 [debug    ] 
response_closed.started        [httpcore.http11]
   2025-05-20 11:07:50.133 | 2025-05-20 15:07:50.133102 [debug    ] 
response_closed.complete       [httpcore.http11]
   2025-05-20 11:07:50.133 | 2025-05-20 15:07:50.133768 [debug    ] Event 
before-parameter-build.cloudwatch-logs.PutLogEvents: calling handler <function 
generate_idempotent_uuid at 0xffffb56b4b80> [botocore.hooks]
   2025-05-20 11:07:50.133 | 2025-05-20 15:07:50.133920 [debug    ] Event 
before-parameter-build.cloudwatch-logs.PutLogEvents: calling handler <function 
_handle_request_validation_mode_member at 0xffffb56b76a0> [botocore.hooks]
   2025-05-20 11:07:50.134 | 2025-05-20 15:07:50.134085 [debug    ] Calling 
endpoint provider with parameters: {'Region': 'us-east-1', 'UseDualStack': 
False, 'UseFIPS': False} [botocore.regions]
   2025-05-20 11:07:50.134 | 2025-05-20 15:07:50.134142 [debug    ] Endpoint 
provider result: https://logs.us-east-1.amazonaws.com [botocore.regions]
   2025-05-20 11:07:50.134 | 2025-05-20 15:07:50.134609 [debug    ] Event 
before-call.cloudwatch-logs.PutLogEvents: calling handler <function 
add_recursion_detection_header at 0xffffb5697a60> [botocore.hooks]
   2025-05-20 11:07:50.134 | 2025-05-20 15:07:50.134649 [debug    ] Event 
before-call.cloudwatch-logs.PutLogEvents: calling handler <function 
add_query_compatibility_header at 0xffffb56b7600> [botocore.hooks]
   2025-05-20 11:07:50.134 | 2025-05-20 15:07:50.134686 [debug    ] Event 
before-call.cloudwatch-logs.PutLogEvents: calling handler <function 
inject_api_version_header_if_needed at 0xffffb56b6660> [botocore.hooks]
   2025-05-20 11:07:50.134 | 2025-05-20 15:07:50.134772 [debug    ] Making 
request for OperationModel(name=PutLogEvents) with params: {'url_path': '/', 
'query_string': '', 'method': 'POST', 'headers': {'X-Amz-Target': 
'Logs_20140328.PutLogEvents', 'Content-Type': 'application/x-amz-json-1.1', 
'User-Agent': 'Boto3/1.37.3 md/Botocore#1.37.3 ua/2.0 os/linux#6.10.14-linuxkit 
md/arch#aarch64 lang/python#3.12.10 md/pyimpl#CPython cfg/retry-mode#legacy 
Botocore/1.37.3 Airflow/3.0.1 AmPP/9.7.0 Caller/Unknown 
DagRunKey/00000000-0000-0000-0000-000000000000'}, 'body': b'{"logGroupName": 
"REDACTED", "logStreamName": 
"dag_id=test_dag/run_id=manual__2025-05-20T15_07_42.882881+00_00/task_id=test/attempt=1.log",
 "logEvents": [{"timestamp": 1747753669872, "message": "{\\"logger\\": 
\\"airflow.plugins_manager\\", \\"event\\": \\"Loading plugins\\", \\"level\\": 
\\"debug\\"}"}, {"timestamp": 1747753669879, "message": "{\\"logger\\": 
\\"airflow.plugins_manager\\", \\"event\\": \\"Loading plugins from directory: /opt/airflow/plugins\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 
1747753669879, "message": "{\\"logger\\": \\"airflow.plugins_manager\\", 
\\"event\\": \\"Loading plugins from entrypoints\\", \\"level\\": 
\\"debug\\"}"}, {"timestamp": 1747753669879, "message": "{\\"logger\\": 
\\"airflow.plugins_manager\\", \\"event\\": \\"Importing entry_point plugin 
openlineage\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 1747753669942, 
"message": "{\\"logger\\": \\"airflow.plugins_manager\\", \\"event\\": 
\\"Loading 1 plugin(s) took 70.19 seconds\\", \\"level\\": \\"debug\\"}"}, 
{"timestamp": 1747753669942, "message": "{\\"logger\\": 
\\"airflow.listeners.listener\\", \\"event\\": \\"Calling \'on_starting\' with 
{\'component\': <airflow.sdk.execution_time.task_runner.TaskRunnerMarker object 
at 0xffff8a218d70>}\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 
1747753669942, "message": "{\\"logger\\": \\"airflow.listeners.listener\\", 
\\"event\\": \\"Hook impls: []\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 1747753669942, "message": "{\\"logger\\": 
\\"airflow.listeners.listener\\", \\"event\\": \\"Result from \'on_starting\': 
[]\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 1747753669943, "message": 
"{\\"logger\\": \\"airflow.dag_processing.bundles.manager.DagBundlesManager\\", 
\\"event\\": \\"DAG bundles loaded: dags-folder\\", \\"level\\": \\"info\\"}"}, 
{"timestamp": 1747753669943, "message": "{\\"logger\\": 
\\"airflow.models.dagbag.DagBag\\", \\"event\\": \\"Filling up the DagBag from 
/opt/airflow/dags/test.py\\", \\"level\\": \\"info\\"}"}, {"timestamp": 
1747753669943, "message": "{\\"logger\\": \\"airflow.models.dagbag.DagBag\\", 
\\"event\\": \\"Importing /opt/airflow/dags/test.py\\", \\"level\\": 
\\"debug\\"}"}, {"timestamp": 1747753669956, "message": "{\\"logger\\": 
\\"airflow.providers_manager\\", \\"event\\": \\"Initializing Providers 
Manager[hook_lineage_writers]\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 
1747753669956, "message": "{\\"logger\\": 
 \\"airflow.providers_manager\\", \\"event\\": \\"Initializing Providers 
Manager[taskflow_decorators]\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 
1747753669956, "message": "{\\"logger\\": \\"airflow.providers_manager\\", 
\\"event\\": \\"Initialization of Providers Manager[taskflow_decorators] took 
0.00 seconds\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 1747753669956, 
"message": "{\\"logger\\": \\"airflow.providers_manager\\", \\"event\\": 
\\"Initialization of Providers Manager[hook_lineage_writers] took 0.00 
seconds\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 1747753669956, 
"message": "{\\"logger\\": \\"airflow.models.dagbag.DagBag\\", \\"event\\": 
\\"Loaded DAG <DAG: test_dag>\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 
1747753669956, "message": "{\\"file\\": \\"test.py\\", \\"logger\\": 
\\"task\\", \\"event\\": \\"DAG file parsed\\", \\"level\\": \\"debug\\"}"}, 
{"timestamp": 1747753669957, "message": "{\\"json\\": 
\\"{\\\\\\"rendered_fields\\\\\\":{\\\\\\"templates_dict\\\\\\":null,\\\\\\"op_args\\\\\\":[],\\\\\\"op_kwargs\\\\\\":{}},\\\\\\"type\\\\\\":\\\\\\"SetRenderedFields\\\\\\"}\\\\n\\",
 \\"logger\\": \\"task\\", \\"event\\": \\"Sending request\\", \\"level\\": 
\\"debug\\"}"}, {"timestamp": 1747753670123, "message": "{\\"logger\\": 
\\"airflow.listeners.listener\\", \\"event\\": \\"Calling 
\'on_task_instance_running\' with {\'previous_state\': 
<TaskInstanceState.QUEUED: \'queued\'>, \'task_instance\': 
RuntimeTaskInstance(id=UUID(\'0196ee3c-79aa-7f8f-aeda-ef09386fc62d\'), 
task_id=\'test\', dag_id=\'test_dag\', 
run_id=\'manual__2025-05-20T15:07:42.882881+00:00\', try_number=1, 
map_index=-1, hostname=\'5f1d5b3f2bb4\', context_carrier={}, 
task=<Task(_PythonDecoratedOperator): test>, 
bundle_instance=LocalDagBundle(name=dags-folder), max_tries=0, 
start_date=datetime.datetime(2025, 5, 20, 15, 7, 49, 722156, 
tzinfo=TzInfo(UTC)), end_date=None, state=<TaskInstanceState.RUNNING: 
\'running\'>, is_mapped=False, rendered_map_index=None)}\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 1747753670123, "message": "{\\"logger\\": 
\\"airflow.listeners.listener\\", \\"event\\": \\"Hook impls: []\\", 
\\"level\\": \\"debug\\"}"}, {"timestamp": 1747753670123, "message": 
"{\\"logger\\": \\"airflow.listeners.listener\\", \\"event\\": \\"Result from 
\'on_task_instance_running\': []\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 
1747753670123, "message": "{\\"logger\\": 
\\"unusual_prefix_7e98759859e7160b3f1504a7fe18f084e0be7596_test\\", 
\\"event\\": \\"Hello world!\\", \\"level\\": \\"info\\"}"}, {"timestamp": 
1747753670123, "message": "{\\"logger\\": 
\\"airflow.task.operators.airflow.providers.standard.decorators.python._PythonDecoratedOperator\\",
 \\"event\\": \\"Done. Returned value was: None\\", \\"level\\": \\"info\\"}"}, 
{"timestamp": 1747753670124, "message": "{\\"json\\": 
\\"{\\\\\\"state\\\\\\":\\\\\\"success\\\\\\",\\\\\\"end_date\\\\\\":\\\\\\"2025-05-20T15:07:49.958659Z\\\\\\",\\\\\\"task_outlets\\\\\\":[],\\\\\\"outlet_events\\\\\\":[],\\\\\\"rendered_map_index\\\\\\":null,\\\\\\"type\\\\\\":\\\\\\"SucceedTask\\\\\\"}\\\\n\\",
 \\"logger\\": \\"task\\", \\"event\\": \\"Sending request\\", \\"level\\": 
\\"debug\\"}"}, {"timestamp": 1747753670124, "message": "{\\"ti\\": 
\\"RuntimeTaskInstance(id=UUID(\'0196ee3c-79aa-7f8f-aeda-ef09386fc62d\'), 
task_id=\'test\', dag_id=\'test_dag\', 
run_id=\'manual__2025-05-20T15:07:42.882881+00:00\', try_number=1, 
map_index=-1, hostname=\'5f1d5b3f2bb4\', context_carrier={}, 
task=<Task(_PythonDecoratedOperator): test>, 
bundle_instance=LocalDagBundle(name=dags-folder), max_tries=0, 
start_date=datetime.datetime(2025, 5, 20, 15, 7, 49, 722156, 
tzinfo=TzInfo(UTC)), end_date=None, state=<TaskInstanceState.SUCCESS: 
\'success\'>, is_mapped=False, rendered_map_index=None)\\", \\"logger\\": 
\\"task\\", \\"event\\": \\"Running finalizers\\", \\"level\\": \\"debug\\"}"}, 
{"timestamp": 1747753670124, "message": "{\\"logger\\": 
\\"airflow.listeners.listener\\", \\"event\\": \\"Calling \'on_task_instance_success\' with {\'previous_state\': <TaskInstanceState.RUNNING: 
\'running\'>, \'task_instance\': 
RuntimeTaskInstance(id=UUID(\'0196ee3c-79aa-7f8f-aeda-ef09386fc62d\'), 
task_id=\'test\', dag_id=\'test_dag\', 
run_id=\'manual__2025-05-20T15:07:42.882881+00:00\', try_number=1, 
map_index=-1, hostname=\'5f1d5b3f2bb4\', context_carrier={}, 
task=<Task(_PythonDecoratedOperator): test>, 
bundle_instance=LocalDagBundle(name=dags-folder), max_tries=0, 
start_date=datetime.datetime(2025, 5, 20, 15, 7, 49, 722156, 
tzinfo=TzInfo(UTC)), end_date=None, state=<TaskInstanceState.SUCCESS: 
\'success\'>, is_mapped=False, rendered_map_index=None)}\\", \\"level\\": 
\\"debug\\"}"}, {"timestamp": 1747753670124, "message": "{\\"logger\\": 
\\"airflow.listeners.listener\\", \\"event\\": \\"Hook impls: []\\", 
\\"level\\": \\"debug\\"}"}, {"timestamp": 1747753670124, "message": 
"{\\"logger\\": \\"airflow.listeners.listener\\", \\"event\\": \\"Result from 
\'on_task_instance_success\': []\\", \\"level\\": 
 \\"debug\\"}"}, {"timestamp": 1747753670124, "message": "{\\"logger\\": 
\\"airflow.listeners.listener\\", \\"event\\": \\"Calling \'before_stopping\' 
with {\'component\': <airflow.sdk.execution_time.task_runner.TaskRunnerMarker 
object at 0xffff8a16e060>}\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 
1747753670124, "message": "{\\"logger\\": \\"airflow.listeners.listener\\", 
\\"event\\": \\"Hook impls: []\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 
1747753670124, "message": "{\\"logger\\": \\"airflow.listeners.listener\\", 
\\"event\\": \\"Result from \'before_stopping\': []\\", \\"level\\": 
\\"debug\\"}"}]}', 'url': 'https://logs.us-east-1.amazonaws.com/', 'context': 
{'client_region': 'us-east-1', 'client_config': <botocore.config.Config object 
at 0xffffafeb6870>, 'has_streaming_input': False, 'auth_type': None, 
'unsigned_payload': None}} [botocore.endpoint]
   2025-05-20 11:07:50.135 | 2025-05-20 15:07:50.134941 [debug    ] Event 
request-created.cloudwatch-logs.PutLogEvents: calling handler <bound method 
RequestSigner.handler of <botocore.signers.RequestSigner object at 
0xffffafeb7ec0>> [botocore.hooks]
   2025-05-20 11:07:50.135 | 2025-05-20 15:07:50.135016 [debug    ] Event 
choose-signer.cloudwatch-logs.PutLogEvents: calling handler <function 
set_operation_specific_signer at 0xffffb56b49a0> [botocore.hooks]
   2025-05-20 11:07:50.135 | 2025-05-20 15:07:50.135330 [debug    ] Calculating 
signature using v4 auth. [botocore.auth]
   2025-05-20 11:07:50.135 | 2025-05-20 15:07:50.135372 [debug    ] 
CanonicalRequest:
   2025-05-20 11:07:50.135 | POST
   2025-05-20 11:07:50.135 | /
   2025-05-20 11:07:50.135 | 
   2025-05-20 11:07:50.135 | content-type:application/x-amz-json-1.1
   2025-05-20 11:07:50.135 | host:logs.us-east-1.amazonaws.com
   2025-05-20 11:07:50.135 | x-amz-date:20250520T150750Z
   2025-05-20 11:07:50.135 | x-amz-target:Logs_20140328.PutLogEvents
   2025-05-20 11:07:50.135 | 
   2025-05-20 11:07:50.135 | content-type;host;x-amz-date;x-amz-target
   2025-05-20 11:07:50.135 | 
e35488513a321e511b829d7b82549579804a2c94566f8c55a40f09d6020eb6d4 [botocore.auth]
   2025-05-20 11:07:50.135 | 2025-05-20 15:07:50.135407 [debug    ] 
StringToSign:
   2025-05-20 11:07:50.135 | AWS4-HMAC-SHA256
   2025-05-20 11:07:50.135 | 20250520T150750Z
   2025-05-20 11:07:50.135 | 20250520/us-east-1/logs/aws4_request
   2025-05-20 11:07:50.135 | 
841af5818d6fb3afcecf31d712fa5ff9ade325156511d4cab25c7efc7a1db924 [botocore.auth]
   2025-05-20 11:07:50.135 | 2025-05-20 15:07:50.135474 [debug    ] Signature:
   2025-05-20 11:07:50.135 | 
b7d66b01b3dd811462dae33a17b70e31630608d73a77b4a1b1ad2108f04d3d80 [botocore.auth]
   2025-05-20 11:07:50.135 | 2025-05-20 15:07:50.135525 [debug    ] Event 
request-created.cloudwatch-logs.PutLogEvents: calling handler <function 
add_retry_headers at 0xffffb56b6e80> [botocore.hooks]
   2025-05-20 11:07:50.136 | 2025-05-20 15:07:50.136123 [debug    ] Sending 
http request: <AWSPreparedRequest stream_output=False, method=POST, 
url=https://logs.us-east-1.amazonaws.com/, headers={'X-Amz-Target': 
b'Logs_20140328.PutLogEvents', 'Content-Type': b'application/x-amz-json-1.1', 
'User-Agent': b'Boto3/1.37.3 md/Botocore#1.37.3 ua/2.0 
os/linux#6.10.14-linuxkit md/arch#aarch64 lang/python#3.12.10 md/pyimpl#CPython 
cfg/retry-mode#legacy Botocore/1.37.3 Airflow/3.0.1 AmPP/9.7.0 Caller/Unknown 
DagRunKey/00000000-0000-0000-0000-000000000000', 'X-Amz-Date': 
b'20250520T150750Z', 'Authorization': b'AWS4-HMAC-SHA256 
Credential=REDACTED/20250520/us-east-1/logs/aws4_request, 
SignedHeaders=content-type;host;x-amz-date;x-amz-target, 
Signature=b7d66b01b3dd811462dae33a17b70e31630608d73a77b4a1b1ad2108f04d3d80', 
'amz-sdk-invocation-id': b'25ceb800-5b35-481d-b4f6-d1572beadbeb', 
'amz-sdk-request': b'attempt=1', 'Content-Length': '7488'}> [botocore.endpoint]
   2025-05-20 11:07:50.136 | 2025-05-20 15:07:50.136368 [debug    ] Certificate 
path: /home/airflow/.local/lib/python3.12/site-packages/certifi/cacert.pem 
[botocore.httpsession]
   2025-05-20 11:07:50.173 | 2025-05-20 15:07:50.173323 [debug    ] 
https://logs.us-east-1.amazonaws.com:443 "POST / HTTP/1.1" 400 91 
[urllib3.connectionpool]
   2025-05-20 11:07:50.173 | 2025-05-20 15:07:50.173559 [debug    ] Response 
headers: {'x-amzn-RequestId': 'c8ad1c3d-344e-4798-88e6-68ba4ccbf283', 
'Content-Type': 'application/x-amz-json-1.1', 'Content-Length': '91', 'Date': 
'Tue, 20 May 2025 15:07:50 GMT'} [botocore.parsers]
   2025-05-20 11:07:50.173 | 2025-05-20 15:07:50.173831 [debug    ] Response 
body:
   2025-05-20 11:07:50.173 | 
b'{"__type":"ResourceNotFoundException","message":"The specified log stream 
does not exist."}' [botocore.parsers]
   2025-05-20 11:07:50.174 | 2025-05-20 15:07:50.174782 [debug    ] Response 
headers: {'x-amzn-RequestId': 'c8ad1c3d-344e-4798-88e6-68ba4ccbf283', 
'Content-Type': 'application/x-amz-json-1.1', 'Content-Length': '91', 'Date': 
'Tue, 20 May 2025 15:07:50 GMT'} [botocore.parsers]
   2025-05-20 11:07:50.174 | 2025-05-20 15:07:50.174817 [debug    ] Response 
body:
   2025-05-20 11:07:50.174 | 
b'{"__type":"ResourceNotFoundException","message":"The specified log stream 
does not exist."}' [botocore.parsers]
   2025-05-20 11:07:50.175 | 2025-05-20 15:07:50.174938 [debug    ] Event 
needs-retry.cloudwatch-logs.PutLogEvents: calling handler 
<botocore.retryhandler.RetryHandler object at 0xffffb01cb0e0> [botocore.hooks]
   2025-05-20 11:07:50.175 | 2025-05-20 15:07:50.175030 [debug    ] No retry 
needed.               [botocore.retryhandler]
   2025-05-20 11:07:50.175 | 2025-05-20 15:07:50.175403 [debug    ] Event 
before-parameter-build.cloudwatch-logs.CreateLogStream: calling handler 
<function generate_idempotent_uuid at 0xffffb56b4b80> [botocore.hooks]
   2025-05-20 11:07:50.175 | 2025-05-20 15:07:50.175476 [debug    ] Event 
before-parameter-build.cloudwatch-logs.CreateLogStream: calling handler 
<function _handle_request_validation_mode_member at 0xffffb56b76a0> 
[botocore.hooks]
   2025-05-20 11:07:50.175 | 2025-05-20 15:07:50.175545 [debug    ] Calling 
endpoint provider with parameters: {'Region': 'us-east-1', 'UseDualStack': 
False, 'UseFIPS': False} [botocore.regions]
   2025-05-20 11:07:50.175 | 2025-05-20 15:07:50.175592 [debug    ] Endpoint 
provider result: https://logs.us-east-1.amazonaws.com [botocore.regions]
   2025-05-20 11:07:50.175 | 2025-05-20 15:07:50.175887 [debug    ] Event 
before-call.cloudwatch-logs.CreateLogStream: calling handler <function 
add_recursion_detection_header at 0xffffb5697a60> [botocore.hooks]
   2025-05-20 11:07:50.176 | 2025-05-20 15:07:50.175957 [debug    ] Event 
before-call.cloudwatch-logs.CreateLogStream: calling handler <function 
add_query_compatibility_header at 0xffffb56b7600> [botocore.hooks]
   2025-05-20 11:07:50.176 | 2025-05-20 15:07:50.176011 [debug    ] Event 
before-call.cloudwatch-logs.CreateLogStream: calling handler <function 
inject_api_version_header_if_needed at 0xffffb56b6660> [botocore.hooks]
   2025-05-20 11:07:50.176 | 2025-05-20 15:07:50.176064 [debug    ] Making 
request for OperationModel(name=CreateLogStream) with params: {'url_path': '/', 
'query_string': '', 'method': 'POST', 'headers': {'X-Amz-Target': 
'Logs_20140328.CreateLogStream', 'Content-Type': 'application/x-amz-json-1.1', 
'User-Agent': 'Boto3/1.37.3 md/Botocore#1.37.3 ua/2.0 os/linux#6.10.14-linuxkit 
md/arch#aarch64 lang/python#3.12.10 md/pyimpl#CPython cfg/retry-mode#legacy 
Botocore/1.37.3 Airflow/3.0.1 AmPP/9.7.0 Caller/Unknown 
DagRunKey/00000000-0000-0000-0000-000000000000'}, 'body': b'{"logGroupName": 
"REDACTED", "logStreamName": 
"dag_id=test_dag/run_id=manual__2025-05-20T15_07_42.882881+00_00/task_id=test/attempt=1.log"}',
 'url': 'https://logs.us-east-1.amazonaws.com/', 'context': {'client_region': 
'us-east-1', 'client_config': <botocore.config.Config object at 
0xffffafeb6870>, 'has_streaming_input': False, 'auth_type': None, 
'unsigned_payload': None}} [botocore.endpoint]
   2025-05-20 11:07:50.176 | 2025-05-20 15:07:50.176167 [debug    ] Event 
request-created.cloudwatch-logs.CreateLogStream: calling handler <bound method 
RequestSigner.handler of <botocore.signers.RequestSigner object at 
0xffffafeb7ec0>> [botocore.hooks]
   2025-05-20 11:07:50.176 | 2025-05-20 15:07:50.176232 [debug    ] Event 
choose-signer.cloudwatch-logs.CreateLogStream: calling handler <function 
set_operation_specific_signer at 0xffffb56b49a0> [botocore.hooks]
   2025-05-20 11:07:50.176 | 2025-05-20 15:07:50.176378 [debug    ] Calculating 
signature using v4 auth. [botocore.auth]
   2025-05-20 11:07:50.176 | 2025-05-20 15:07:50.176431 [debug    ] 
CanonicalRequest:
   2025-05-20 11:07:50.176 | POST
   2025-05-20 11:07:50.176 | /
   2025-05-20 11:07:50.176 | 
   2025-05-20 11:07:50.176 | content-type:application/x-amz-json-1.1
   2025-05-20 11:07:50.176 | host:logs.us-east-1.amazonaws.com
   2025-05-20 11:07:50.176 | x-amz-date:20250520T150750Z
   2025-05-20 11:07:50.176 | x-amz-target:Logs_20140328.CreateLogStream
   2025-05-20 11:07:50.176 | 
   2025-05-20 11:07:50.176 | content-type;host;x-amz-date;x-amz-target
   2025-05-20 11:07:50.176 | 
37c09e0e26e54f0885225676e09d53d0d5145f0c6a781cea6dc2ce54952f7505 [botocore.auth]
   2025-05-20 11:07:50.176 | 2025-05-20 15:07:50.176464 [debug    ] 
StringToSign:
   2025-05-20 11:07:50.176 | AWS4-HMAC-SHA256
   2025-05-20 11:07:50.176 | 20250520T150750Z
   2025-05-20 11:07:50.176 | 20250520/us-east-1/logs/aws4_request
   2025-05-20 11:07:50.176 | 
00ae7bb1034a2cb70638504c146fe57ebc5ae932392da691c48de3bfb817338d [botocore.auth]
   2025-05-20 11:07:50.176 | 2025-05-20 15:07:50.176518 [debug    ] Signature:
   2025-05-20 11:07:50.176 | 
013ec430e100c2896ef30775e56fb170d68f669f9213ac5a86050a58da7ed4ee [botocore.auth]
   2025-05-20 11:07:50.176 | 2025-05-20 15:07:50.176563 [debug    ] Event 
request-created.cloudwatch-logs.CreateLogStream: calling handler <function 
add_retry_headers at 0xffffb56b6e80> [botocore.hooks]
   2025-05-20 11:07:50.176 | 2025-05-20 15:07:50.176618 [debug    ] Sending 
http request: <AWSPreparedRequest stream_output=False, method=POST, 
url=https://logs.us-east-1.amazonaws.com/, headers={'X-Amz-Target': 
b'Logs_20140328.CreateLogStream', 'Content-Type': 
b'application/x-amz-json-1.1', 'User-Agent': b'Boto3/1.37.3 md/Botocore#1.37.3 
ua/2.0 os/linux#6.10.14-linuxkit md/arch#aarch64 lang/python#3.12.10 
md/pyimpl#CPython cfg/retry-mode#legacy Botocore/1.37.3 Airflow/3.0.1 
AmPP/9.7.0 Caller/Unknown DagRunKey/00000000-0000-0000-0000-000000000000', 
'X-Amz-Date': b'20250520T150750Z', 'Authorization': b'AWS4-HMAC-SHA256 
Credential=AKIA5KRESCYYYRXCTOOL/20250520/us-east-1/logs/aws4_request, 
SignedHeaders=content-type;host;x-amz-date;x-amz-target, 
Signature=013ec430e100c2896ef30775e56fb170d68f669f9213ac5a86050a58da7ed4ee', 
'amz-sdk-invocation-id': b'b73a2917-85e5-4585-890f-b4e9b5129245', 
'amz-sdk-request': b'attempt=1', 'Content-Length': '149'}> [botocore.endpoint]
   2025-05-20 11:07:50.176 | 2025-05-20 15:07:50.176739 [debug    ] Certificate 
path: /home/airflow/.local/lib/python3.12/site-packages/certifi/cacert.pem 
[botocore.httpsession]
   2025-05-20 11:07:50.217 | 2025-05-20 15:07:50.217464 [debug    ] 
https://logs.us-east-1.amazonaws.com:443 "POST / HTTP/1.1" 200 0 
[urllib3.connectionpool]
   2025-05-20 11:07:50.217 | 2025-05-20 15:07:50.217688 [debug    ] Response 
headers: {'x-amzn-RequestId': 'ed4fd123-39c4-48b7-bfa9-7a85eb9f314d', 
'Content-Type': 'application/x-amz-json-1.1', 'Content-Length': '0', 'Date': 
'Tue, 20 May 2025 15:07:50 GMT'} [botocore.parsers]
   2025-05-20 11:07:50.217 | 2025-05-20 15:07:50.217740 [debug    ] Response 
body:
   2025-05-20 11:07:50.217 | b''             [botocore.parsers]
   2025-05-20 11:07:50.217 | 2025-05-20 15:07:50.217854 [debug    ] Event 
needs-retry.cloudwatch-logs.CreateLogStream: calling handler 
<botocore.retryhandler.RetryHandler object at 0xffffb01cb0e0> [botocore.hooks]
   2025-05-20 11:07:50.218 | 2025-05-20 15:07:50.217917 [debug    ] No retry 
needed.               [botocore.retryhandler]
   2025-05-20 11:07:50.218 | 2025-05-20 15:07:50.218044 [debug    ] Event 
before-parameter-build.cloudwatch-logs.PutLogEvents: calling handler <function 
generate_idempotent_uuid at 0xffffb56b4b80> [botocore.hooks]
   2025-05-20 11:07:50.218 | 2025-05-20 15:07:50.218087 [debug    ] Event 
before-parameter-build.cloudwatch-logs.PutLogEvents: calling handler <function 
_handle_request_validation_mode_member at 0xffffb56b76a0> [botocore.hooks]
   2025-05-20 11:07:50.218 | 2025-05-20 15:07:50.218138 [debug    ] Calling 
endpoint provider with parameters: {'Region': 'us-east-1', 'UseDualStack': 
False, 'UseFIPS': False} [botocore.regions]
   2025-05-20 11:07:50.218 | 2025-05-20 15:07:50.218177 [debug    ] Endpoint 
provider result: https://logs.us-east-1.amazonaws.com [botocore.regions]
   2025-05-20 11:07:50.218 | 2025-05-20 15:07:50.218455 [debug    ] Event 
before-call.cloudwatch-logs.PutLogEvents: calling handler <function 
add_recursion_detection_header at 0xffffb5697a60> [botocore.hooks]
   2025-05-20 11:07:50.218 | 2025-05-20 15:07:50.218502 [debug    ] Event 
before-call.cloudwatch-logs.PutLogEvents: calling handler <function 
add_query_compatibility_header at 0xffffb56b7600> [botocore.hooks]
   2025-05-20 11:07:50.218 | 2025-05-20 15:07:50.218538 [debug    ] Event 
before-call.cloudwatch-logs.PutLogEvents: calling handler <function 
inject_api_version_header_if_needed at 0xffffb56b6660> [botocore.hooks]
   2025-05-20 11:07:50.218 | 2025-05-20 15:07:50.218607 [debug    ] Making 
request for OperationModel(name=PutLogEvents) with params: {'url_path': '/', 
'query_string': '', 'method': 'POST', 'headers': {'X-Amz-Target': 
'Logs_20140328.PutLogEvents', 'Content-Type': 'application/x-amz-json-1.1', 
'User-Agent': 'Boto3/1.37.3 md/Botocore#1.37.3 ua/2.0 os/linux#6.10.14-linuxkit 
md/arch#aarch64 lang/python#3.12.10 md/pyimpl#CPython cfg/retry-mode#legacy 
Botocore/1.37.3 Airflow/3.0.1 AmPP/9.7.0 Caller/Unknown 
DagRunKey/00000000-0000-0000-0000-000000000000'}, 'body': b'{"logGroupName": 
"REDACTED", "logStreamName": 
"dag_id=test_dag/run_id=manual__2025-05-20T15_07_42.882881+00_00/task_id=test/attempt=1.log",
 "logEvents": [{"timestamp": 1747753669872, "message": "{\\"logger\\": 
\\"airflow.plugins_manager\\", \\"event\\": \\"Loading plugins\\", \\"level\\": 
\\"debug\\"}"}, {"timestamp": 1747753669879, "message": "{\\"logger\\": 
\\"airflow.plugins_manager\\", \\"event\\": \\"Loading plugins from directory: /opt/airflow/plugins\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 
1747753669879, "message": "{\\"logger\\": \\"airflow.plugins_manager\\", 
\\"event\\": \\"Loading plugins from entrypoints\\", \\"level\\": 
\\"debug\\"}"}, {"timestamp": 1747753669879, "message": "{\\"logger\\": 
\\"airflow.plugins_manager\\", \\"event\\": \\"Importing entry_point plugin 
openlineage\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 1747753669942, 
"message": "{\\"logger\\": \\"airflow.plugins_manager\\", \\"event\\": 
\\"Loading 1 plugin(s) took 70.19 seconds\\", \\"level\\": \\"debug\\"}"}, 
{"timestamp": 1747753669942, "message": "{\\"logger\\": 
\\"airflow.listeners.listener\\", \\"event\\": \\"Calling \'on_starting\' with 
{\'component\': <airflow.sdk.execution_time.task_runner.TaskRunnerMarker object 
at 0xffff8a218d70>}\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 
1747753669942, "message": "{\\"logger\\": \\"airflow.listeners.listener\\", 
\\"event\\": \\"Hook impls: []\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 1747753669942, "message": "{\\"logger\\": 
\\"airflow.listeners.listener\\", \\"event\\": \\"Result from \'on_starting\': 
[]\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 1747753669943, "message": 
"{\\"logger\\": \\"airflow.dag_processing.bundles.manager.DagBundlesManager\\", 
\\"event\\": \\"DAG bundles loaded: dags-folder\\", \\"level\\": \\"info\\"}"}, 
{"timestamp": 1747753669943, "message": "{\\"logger\\": 
\\"airflow.models.dagbag.DagBag\\", \\"event\\": \\"Filling up the DagBag from 
/opt/airflow/dags/test.py\\", \\"level\\": \\"info\\"}"}, {"timestamp": 
1747753669943, "message": "{\\"logger\\": \\"airflow.models.dagbag.DagBag\\", 
\\"event\\": \\"Importing /opt/airflow/dags/test.py\\", \\"level\\": 
\\"debug\\"}"}, {"timestamp": 1747753669956, "message": "{\\"logger\\": 
\\"airflow.providers_manager\\", \\"event\\": \\"Initializing Providers 
Manager[hook_lineage_writers]\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 
1747753669956, "message": "{\\"logger\\": \\"airflow.providers_manager\\", \\"event\\": \\"Initializing Providers 
Manager[taskflow_decorators]\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 
1747753669956, "message": "{\\"logger\\": \\"airflow.providers_manager\\", 
\\"event\\": \\"Initialization of Providers Manager[taskflow_decorators] took 
0.00 seconds\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 1747753669956, 
"message": "{\\"logger\\": \\"airflow.providers_manager\\", \\"event\\": 
\\"Initialization of Providers Manager[hook_lineage_writers] took 0.00 
seconds\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 1747753669956, 
"message": "{\\"logger\\": \\"airflow.models.dagbag.DagBag\\", \\"event\\": 
\\"Loaded DAG <DAG: test_dag>\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 
1747753669956, "message": "{\\"file\\": \\"test.py\\", \\"logger\\": 
\\"task\\", \\"event\\": \\"DAG file parsed\\", \\"level\\": \\"debug\\"}"}, 
{"timestamp": 1747753669957, "message": "{\\"json\\": 
\\"{\\\\\\"rendered_fields\\\\\\":{\\\\\\"templates_dict\\\\\\":null,\\\\\\"op_args\\\\\\":[],\\\\\\"op_kwargs\\\\\\":{}},\\\\\\"type\\\\\\":\\\\\\"SetRenderedFields\\\\\\"}\\\\n\\",
 \\"logger\\": \\"task\\", \\"event\\": \\"Sending request\\", \\"level\\": 
\\"debug\\"}"}, {"timestamp": 1747753670123, "message": "{\\"logger\\": 
\\"airflow.listeners.listener\\", \\"event\\": \\"Calling 
\'on_task_instance_running\' with {\'previous_state\': 
<TaskInstanceState.QUEUED: \'queued\'>, \'task_instance\': 
RuntimeTaskInstance(id=UUID(\'0196ee3c-79aa-7f8f-aeda-ef09386fc62d\'), 
task_id=\'test\', dag_id=\'test_dag\', 
run_id=\'manual__2025-05-20T15:07:42.882881+00:00\', try_number=1, 
map_index=-1, hostname=\'5f1d5b3f2bb4\', context_carrier={}, 
task=<Task(_PythonDecoratedOperator): test>, 
bundle_instance=LocalDagBundle(name=dags-folder), max_tries=0, 
start_date=datetime.datetime(2025, 5, 20, 15, 7, 49, 722156, 
tzinfo=TzInfo(UTC)), end_date=None, state=<TaskInstanceState.RUNNING: 
\'running\'>, is_mapped=False, rendered_map_index=None)}\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 1747753670123, "message": "{\\"logger\\": 
\\"airflow.listeners.listener\\", \\"event\\": \\"Hook impls: []\\", 
\\"level\\": \\"debug\\"}"}, {"timestamp": 1747753670123, "message": 
"{\\"logger\\": \\"airflow.listeners.listener\\", \\"event\\": \\"Result from 
\'on_task_instance_running\': []\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 
1747753670123, "message": "{\\"logger\\": 
\\"unusual_prefix_7e98759859e7160b3f1504a7fe18f084e0be7596_test\\", 
\\"event\\": \\"Hello world!\\", \\"level\\": \\"info\\"}"}, {"timestamp": 
1747753670123, "message": "{\\"logger\\": 
\\"airflow.task.operators.airflow.providers.standard.decorators.python._PythonDecoratedOperator\\",
 \\"event\\": \\"Done. Returned value was: None\\", \\"level\\": \\"info\\"}"}, 
{"timestamp": 1747753670124, "message": "{\\"json\\": 
\\"{\\\\\\"state\\\\\\":\\\\\\"success\\\\\\",\\\\\\"end_date\\\\\\":\\\\\\"2025-05-20T15:07:49.958659Z\\\\\\",\\\\\\"task_outlets\\\\\\":[],\\\\\\"outlet_events\\\\\\":[],\\\\\\"rendered_map_index\\\\\\":null,\\\\\\"type\\\\\\":\\\\\\"SucceedTask\\\\\\"}\\\\n\\",
 \\"logger\\": \\"task\\", \\"event\\": \\"Sending request\\", \\"level\\": 
\\"debug\\"}"}, {"timestamp": 1747753670124, "message": "{\\"ti\\": 
\\"RuntimeTaskInstance(id=UUID(\'0196ee3c-79aa-7f8f-aeda-ef09386fc62d\'), 
task_id=\'test\', dag_id=\'test_dag\', 
run_id=\'manual__2025-05-20T15:07:42.882881+00:00\', try_number=1, 
map_index=-1, hostname=\'5f1d5b3f2bb4\', context_carrier={}, 
task=<Task(_PythonDecoratedOperator): test>, 
bundle_instance=LocalDagBundle(name=dags-folder), max_tries=0, 
start_date=datetime.datetime(2025, 5, 20, 15, 7, 49, 722156, 
tzinfo=TzInfo(UTC)), end_date=None, state=<TaskInstanceState.SUCCESS: 
\'success\'>, is_mapped=False, rendered_map_index=None)\\", \\"logger\\": 
\\"task\\", \\"event\\": \\"Running finalizers\\", \\"level\\": \\"debug\\"}"}, 
{"timestamp": 1747753670124, "message": "{\\"logger\\": 
\\"airflow.listeners.listener\\", \\"event\\": \\"Calling \'on_task_instance_success\' with {\'previous_state\': <TaskInstanceState.RUNNING: 
\'running\'>, \'task_instance\': 
RuntimeTaskInstance(id=UUID(\'0196ee3c-79aa-7f8f-aeda-ef09386fc62d\'), 
task_id=\'test\', dag_id=\'test_dag\', 
run_id=\'manual__2025-05-20T15:07:42.882881+00:00\', try_number=1, 
map_index=-1, hostname=\'5f1d5b3f2bb4\', context_carrier={}, 
task=<Task(_PythonDecoratedOperator): test>, 
bundle_instance=LocalDagBundle(name=dags-folder), max_tries=0, 
start_date=datetime.datetime(2025, 5, 20, 15, 7, 49, 722156, 
tzinfo=TzInfo(UTC)), end_date=None, state=<TaskInstanceState.SUCCESS: 
\'success\'>, is_mapped=False, rendered_map_index=None)}\\", \\"level\\": 
\\"debug\\"}"}, {"timestamp": 1747753670124, "message": "{\\"logger\\": 
\\"airflow.listeners.listener\\", \\"event\\": \\"Hook impls: []\\", 
\\"level\\": \\"debug\\"}"}, {"timestamp": 1747753670124, "message": 
"{\\"logger\\": \\"airflow.listeners.listener\\", \\"event\\": \\"Result from 
\'on_task_instance_success\': []\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 1747753670124, "message": "{\\"logger\\": 
\\"airflow.listeners.listener\\", \\"event\\": \\"Calling \'before_stopping\' 
with {\'component\': <airflow.sdk.execution_time.task_runner.TaskRunnerMarker 
object at 0xffff8a16e060>}\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 
1747753670124, "message": "{\\"logger\\": \\"airflow.listeners.listener\\", 
\\"event\\": \\"Hook impls: []\\", \\"level\\": \\"debug\\"}"}, {"timestamp": 
1747753670124, "message": "{\\"logger\\": \\"airflow.listeners.listener\\", 
\\"event\\": \\"Result from \'before_stopping\': []\\", \\"level\\": 
\\"debug\\"}"}]}', 'url': 'https://logs.us-east-1.amazonaws.com/', 'context': 
{'client_region': 'us-east-1', 'client_config': <botocore.config.Config object 
at 0xffffafeb6870>, 'has_streaming_input': False, 'auth_type': None, 
'unsigned_payload': None}} [botocore.endpoint]
   2025-05-20 11:07:50.219 | 2025-05-20 15:07:50.218688 [debug    ] Event 
request-created.cloudwatch-logs.PutLogEvents: calling handler <bound method 
RequestSigner.handler of <botocore.signers.RequestSigner object at 
0xffffafeb7ec0>> [botocore.hooks]
   2025-05-20 11:07:50.219 | 2025-05-20 15:07:50.218719 [debug    ] Event 
choose-signer.cloudwatch-logs.PutLogEvents: calling handler <function 
set_operation_specific_signer at 0xffffb56b49a0> [botocore.hooks]
   2025-05-20 11:07:50.219 | 2025-05-20 15:07:50.218848 [debug    ] Calculating 
signature using v4 auth. [botocore.auth]
   2025-05-20 11:07:50.219 | 2025-05-20 15:07:50.218899 [debug    ] 
CanonicalRequest:
   2025-05-20 11:07:50.219 | POST
   2025-05-20 11:07:50.219 | /
   2025-05-20 11:07:50.219 | 
   2025-05-20 11:07:50.219 | content-type:application/x-amz-json-1.1
   2025-05-20 11:07:50.219 | host:logs.us-east-1.amazonaws.com
   2025-05-20 11:07:50.219 | x-amz-date:20250520T150750Z
   2025-05-20 11:07:50.219 | x-amz-target:Logs_20140328.PutLogEvents
   2025-05-20 11:07:50.219 | 
   2025-05-20 11:07:50.219 | content-type;host;x-amz-date;x-amz-target
   2025-05-20 11:07:50.219 | 
e35488513a321e511b829d7b82549579804a2c94566f8c55a40f09d6020eb6d4 [botocore.auth]
   2025-05-20 11:07:50.219 | 2025-05-20 15:07:50.218929 [debug    ] 
StringToSign:
   2025-05-20 11:07:50.219 | AWS4-HMAC-SHA256
   2025-05-20 11:07:50.219 | 20250520T150750Z
   2025-05-20 11:07:50.219 | 20250520/us-east-1/logs/aws4_request
   2025-05-20 11:07:50.219 | 
841af5818d6fb3afcecf31d712fa5ff9ade325156511d4cab25c7efc7a1db924 [botocore.auth]
   2025-05-20 11:07:50.219 | 2025-05-20 15:07:50.218988 [debug    ] Signature:
   2025-05-20 11:07:50.219 | 
b7d66b01b3dd811462dae33a17b70e31630608d73a77b4a1b1ad2108f04d3d80 [botocore.auth]
   2025-05-20 11:07:50.219 | 2025-05-20 15:07:50.219039 [debug    ] Event 
request-created.cloudwatch-logs.PutLogEvents: calling handler <function 
add_retry_headers at 0xffffb56b6e80> [botocore.hooks]
   2025-05-20 11:07:50.219 | 2025-05-20 15:07:50.219092 [debug    ] Sending 
http request: <AWSPreparedRequest stream_output=False, method=POST, 
url=https://logs.us-east-1.amazonaws.com/, headers={'X-Amz-Target': 
b'Logs_20140328.PutLogEvents', 'Content-Type': b'application/x-amz-json-1.1', 
'User-Agent': b'Boto3/1.37.3 md/Botocore#1.37.3 ua/2.0 
os/linux#6.10.14-linuxkit md/arch#aarch64 lang/python#3.12.10 md/pyimpl#CPython 
cfg/retry-mode#legacy Botocore/1.37.3 Airflow/3.0.1 AmPP/9.7.0 Caller/Unknown 
DagRunKey/00000000-0000-0000-0000-000000000000', 'X-Amz-Date': 
b'20250520T150750Z', 'Authorization': b'AWS4-HMAC-SHA256 
Credential=AKIA5KRESCYYYRXCTOOL/20250520/us-east-1/logs/aws4_request, 
SignedHeaders=content-type;host;x-amz-date;x-amz-target, 
Signature=b7d66b01b3dd811462dae33a17b70e31630608d73a77b4a1b1ad2108f04d3d80', 
'amz-sdk-invocation-id': b'4409d7e3-fc91-4ddf-b7c6-3b1ca7a5f373', 
'amz-sdk-request': b'attempt=1', 'Content-Length': '7488'}> [botocore.endpoint]
   2025-05-20 11:07:50.219 | 2025-05-20 15:07:50.219188 [debug    ] Certificate 
path: /home/airflow/.local/lib/python3.12/site-packages/certifi/cacert.pem 
[botocore.httpsession]
   2025-05-20 11:07:50.265 | 2025-05-20 15:07:50.265456 [debug    ] 
https://logs.us-east-1.amazonaws.com:443 "POST / HTTP/1.1" 200 80 
[urllib3.connectionpool]
   2025-05-20 11:07:50.265 | 2025-05-20 15:07:50.265647 [debug    ] Response 
headers: {'x-amzn-RequestId': 'baa2e774-456a-4daf-8ebd-14df86704b84', 
'Content-Type': 'application/x-amz-json-1.1', 'Content-Length': '80', 'Date': 
'Tue, 20 May 2025 15:07:50 GMT'} [botocore.parsers]
   2025-05-20 11:07:50.265 | 2025-05-20 15:07:50.265705 [debug    ] Response 
body:
   2025-05-20 11:07:50.265 | 
b'{"nextSequenceToken":"49659153000038441045478490166319604883260026562090633026"}'
 [botocore.parsers]
   2025-05-20 11:07:50.265 | 2025-05-20 15:07:50.265798 [debug    ] Event 
needs-retry.cloudwatch-logs.PutLogEvents: calling handler 
<botocore.retryhandler.RetryHandler object at 0xffffb01cb0e0> [botocore.hooks]
   2025-05-20 11:07:50.266 | 2025-05-20 15:07:50.265851 [debug    ] No retry 
needed.               [botocore.retryhandler]
   2025-05-20 11:07:50.266 | 2025-05-20 15:07:50.266101 [info     ] Task 
finished                  [supervisor] duration=0.5486190420000412 exit_code=0 
final_state=success
   2025-05-20 11:07:50.269 | 2025-05-20 15:07:50.269511 [info     ] Task 
execute_workload[923401f6-b277-412f-81dd-b468e0c9ff16] succeeded in 
0.5968052089999674s: None [celery.app.trace]
   2025-05-20 11:07:53.606 | 2025-05-20 15:07:53.606330 [debug    ] pidbox 
received method ping() [reply_to:{'exchange': 'reply.celery.pidbox', 
'routing_key': '22daf8f2-e6b1-3770-a567-881ab9da4167'} 
ticket:6ec5e7f6-f12f-421f-a56e-43dd9404c675] [kombu.pidbox]
   2025-05-20 11:07:58.660 | 2025-05-20 15:07:58.660258 [info     ] Task 
execute_workload[75db0d4c-dfb7-474b-bc27-5c7153b5a6d7] received 
[celery.worker.strategy]
   2025-05-20 11:07:58.660 | 2025-05-20 15:07:58.660680 [debug    ] TaskPool: 
Apply <function fast_trace_task at 0xffffaefb91c0> (args:('execute_workload', 
'75db0d4c-dfb7-474b-bc27-5c7153b5a6d7', {'lang': 'py', 'task': 
'execute_workload', 'id': '75db0d4c-dfb7-474b-bc27-5c7153b5a6d7', 'shadow': 
None, 'eta': None, 'expires': None, 'group': None, 'group_index': None, 
'retries': 0, 'timelimit': [None, None], 'root_id': 
'75db0d4c-dfb7-474b-bc27-5c7153b5a6d7', 'parent_id': None, 'argsrepr':... 
kwargs:{}) [celery.pool]
   2025-05-20 11:07:58.672 | 2025-05-20 15:07:58.672644 [info     ] 
[75db0d4c-dfb7-474b-bc27-5c7153b5a6d7] Executing workload in Celery: 
token='eyJ***' ti=TaskInstance(id=UUID('0196ee3c-b6db-77fe-b4ce-6a4b00ff8a47'), 
task_id='test', dag_id='test_dag', 
run_id='manual__2025-05-20T15:07:58.546809+00:00', try_number=1, map_index=-1, 
pool_slots=1, queue='default', priority_weight=1, executor_config=None, 
parent_context_carrier={}, context_carrier={}, queued_dttm=None) 
dag_rel_path=PurePosixPath('test.py') 
bundle_info=BundleInfo(name='dags-folder', version=None) 
log_path='dag_id=test_dag/run_id=manual__2025-05-20T15:07:58.546809+00:00/task_id=test/attempt=1.log'
 type='ExecuteTask' [airflow.providers.celery.executors.celery_executor_utils]
   2025-05-20 11:07:58.696 | 2025-05-20 15:07:58.696118 [info     ] Secrets 
backends loaded for worker [supervisor] 
backend_classes=['EnvironmentVariablesBackend'] count=1
   2025-05-20 11:07:58.708 | 2025-05-20 15:07:58.708616 [debug    ] 
connect_tcp.started host='airflow-apiserver' port=8080 local_address=None 
timeout=5.0 socket_options=None [httpcore.connection]
   2025-05-20 11:07:58.709 | 2025-05-20 15:07:58.709595 [debug    ] 
connect_tcp.complete return_value=<httpcore._backends.sync.SyncStream object at 
0xffff8a18deb0> [httpcore.connection]
   2025-05-20 11:07:58.709 | 2025-05-20 15:07:58.709836 [debug    ] 
send_request_headers.started request=<Request [b'PATCH']> [httpcore.http11]
   2025-05-20 11:07:58.710 | 2025-05-20 15:07:58.710247 [debug    ] 
send_request_headers.complete  [httpcore.http11]
   2025-05-20 11:07:58.710 | 2025-05-20 15:07:58.710315 [debug    ] 
send_request_body.started request=<Request [b'PATCH']> [httpcore.http11]
   2025-05-20 11:07:58.710 | 2025-05-20 15:07:58.710441 [debug    ] 
send_request_body.complete     [httpcore.http11]
   2025-05-20 11:07:58.710 | 2025-05-20 15:07:58.710539 [debug    ] 
receive_response_headers.started request=<Request [b'PATCH']> [httpcore.http11]
   2025-05-20 11:07:58.724 | 2025-05-20 15:07:58.724401 [debug    ] 
receive_response_headers.complete return_value=(b'HTTP/1.1', 200, b'OK', 
[(b'date', b'Tue, 20 May 2025 15:07:57 GMT'), (b'server', b'uvicorn'), 
(b'content-type', b'application/json'), (b'airflow-api-version', 
b'2025-04-28'), (b'vary', b'Accept-Encoding'), (b'content-encoding', b'gzip'), 
(b'transfer-encoding', b'chunked')]) [httpcore.http11]
   2025-05-20 11:07:58.728 | 2025-05-20 15:07:58.728282 [debug    ] 
receive_response_body.started request=<Request [b'PATCH']> [httpcore.http11]
   2025-05-20 11:07:58.728 | 2025-05-20 15:07:58.728541 [debug    ] 
receive_response_body.complete [httpcore.http11]
   2025-05-20 11:07:58.728 | 2025-05-20 15:07:58.728619 [debug    ] 
response_closed.started        [httpcore.http11]
   2025-05-20 11:07:58.728 | 2025-05-20 15:07:58.728682 [debug    ] 
response_closed.complete       [httpcore.http11]
   2025-05-20 11:07:58.729 | 2025-05-20 15:07:58.729201 [debug    ] Sending     
                   [supervisor] 
msg=StartupDetails(ti=TaskInstance(id=UUID('0196ee3c-b6db-77fe-b4ce-6a4b00ff8a47'),
 task_id='test', dag_id='test_dag', 
run_id='manual__2025-05-20T15:07:58.546809+00:00', try_number=1, map_index=-1, 
pool_slots=1, queue='default', priority_weight=1, executor_config=None, 
parent_context_carrier={}, context_carrier={}, queued_dttm=None), 
dag_rel_path='test.py', bundle_info=BundleInfo(name='dags-folder', 
version=None), requests_fd=102, start_date=datetime.datetime(2025, 5, 20, 15, 
7, 58, 706653, tzinfo=datetime.timezone.utc), 
ti_context=TIRunContext(dag_run=DagRun(dag_id='test_dag', 
run_id='manual__2025-05-20T15:07:58.546809+00:00', 
logical_date=datetime.datetime(2025, 5, 20, 15, 7, 57, 741000, 
tzinfo=TzInfo(UTC)), data_interval_start=datetime.datetime(2025, 5, 20, 15, 7, 
57, 741000, tzinfo=TzInfo(UTC)), data_interval_end=datetime.datetime(2025, 5, 
20, 15, 7, 57, 741000, tzinfo=TzInfo(UTC)), run_after=datetime.datetime(2025, 5, 20, 15, 7, 57, 741000, 
tzinfo=TzInfo(UTC)), start_date=datetime.datetime(2025, 5, 20, 15, 7, 58, 
628521, tzinfo=TzInfo(UTC)), end_date=None, clear_number=0, 
run_type=<DagRunType.MANUAL: 'manual'>, conf={}, consumed_asset_events=[]), 
task_reschedule_count=0, max_tries=0, variables=[], connections=[], 
upstream_map_indexes={}, next_method=None, next_kwargs=None, 
xcom_keys_to_clear=[], should_retry=False), type='StartupDetails')
   2025-05-20 11:07:58.746 | 2025-05-20 15:07:58.746784 [warning  ] 
/home/airflow/.local/lib/python3.12/site-packages/watchtower/__init__.py:464: 
WatchtowerWarning: Received message after logging system shutdown
   2025-05-20 11:07:58.746 |   warnings.warn("Received message after logging 
system shutdown", WatchtowerWarning)
   2025-05-20 11:07:58.746 |  [py.warnings]
   2025-05-20 11:07:58.747 | 2025-05-20 15:07:58.747339 [debug    ] 
send_request_headers.started request=<Request [b'PUT']> [httpcore.http11]
   2025-05-20 11:07:58.747 | 2025-05-20 15:07:58.747506 [debug    ] 
send_request_headers.complete  [httpcore.http11]
   2025-05-20 11:07:58.747 | 2025-05-20 15:07:58.747553 [debug    ] 
send_request_body.started request=<Request [b'PUT']> [httpcore.http11]
   2025-05-20 11:07:58.747 | 2025-05-20 15:07:58.747707 [debug    ] 
send_request_body.complete     [httpcore.http11]
   2025-05-20 11:07:58.747 | 2025-05-20 15:07:58.747757 [debug    ] 
receive_response_headers.started request=<Request [b'PUT']> [httpcore.http11]
   2025-05-20 11:07:58.751 | 2025-05-20 15:07:58.751608 [debug    ] 
receive_response_headers.complete return_value=(b'HTTP/1.1', 204, b'No 
Content', [(b'date', b'Tue, 20 May 2025 15:07:57 GMT'), (b'server', 
b'uvicorn'), (b'content-type', b'application/json'), (b'airflow-api-version', 
b'2025-04-28')]) [httpcore.http11]
   2025-05-20 11:07:58.751 | 2025-05-20 15:07:58.751797 [debug    ] 
receive_response_body.started request=<Request [b'PUT']> [httpcore.http11]
   2025-05-20 11:07:58.751 | 2025-05-20 15:07:58.751850 [debug    ] 
receive_response_body.complete [httpcore.http11]
   2025-05-20 11:07:58.751 | 2025-05-20 15:07:58.751891 [debug    ] 
response_closed.started        [httpcore.http11]
   2025-05-20 11:07:58.752 | 2025-05-20 15:07:58.751935 [debug    ] 
response_closed.complete       [httpcore.http11]
   2025-05-20 11:07:58.819 | 2025-05-20 15:07:58.819136 [debug    ] Received 
message from task runner [supervisor] 
msg=SetRenderedFields(rendered_fields={'templates_dict': None, 'op_args': [], 
'op_kwargs': {}}, type='SetRenderedFields')
   2025-05-20 11:07:58.819 | 2025-05-20 15:07:58.819550 [debug    ] 
send_request_headers.started request=<Request [b'PUT']> [httpcore.http11]
   2025-05-20 11:07:58.819 | 2025-05-20 15:07:58.819713 [debug    ] 
send_request_headers.complete  [httpcore.http11]
   2025-05-20 11:07:58.819 | 2025-05-20 15:07:58.819759 [debug    ] 
send_request_body.started request=<Request [b'PUT']> [httpcore.http11]
   2025-05-20 11:07:58.819 | 2025-05-20 15:07:58.819815 [debug    ] 
send_request_body.complete     [httpcore.http11]
   2025-05-20 11:07:58.819 | 2025-05-20 15:07:58.819849 [debug    ] 
receive_response_headers.started request=<Request [b'PUT']> [httpcore.http11]
   2025-05-20 11:07:58.828 | 2025-05-20 15:07:58.827942 [debug    ] 
receive_response_headers.complete return_value=(b'HTTP/1.1', 201, b'Created', 
[(b'date', b'Tue, 20 May 2025 15:07:57 GMT'), (b'server', b'uvicorn'), 
(b'content-type', b'application/json'), (b'airflow-api-version', 
b'2025-04-28'), (b'vary', b'Accept-Encoding'), (b'content-encoding', b'gzip'), 
(b'transfer-encoding', b'chunked')]) [httpcore.http11]
   2025-05-20 11:07:58.828 | 2025-05-20 15:07:58.828188 [debug    ] 
receive_response_body.started request=<Request [b'PUT']> [httpcore.http11]
   2025-05-20 11:07:58.828 | 2025-05-20 15:07:58.828291 [debug    ] 
receive_response_body.complete [httpcore.http11]
   2025-05-20 11:07:58.828 | 2025-05-20 15:07:58.828339 [debug    ] 
response_closed.started        [httpcore.http11]
   2025-05-20 11:07:58.828 | 2025-05-20 15:07:58.828376 [debug    ] 
response_closed.complete       [httpcore.http11]
   2025-05-20 11:07:58.829 | 2025-05-20 15:07:58.829808 [debug    ] send_request_headers.started request=<Request [b'PATCH']> [httpcore.http11]
   2025-05-20 11:07:58.829 | 2025-05-20 15:07:58.829439 [debug    ] Received message from task runner [supervisor] msg=SucceedTask(state='success', end_date=datetime.datetime(2025, 5, 20, 15, 7, 58, 819799, tzinfo=TzInfo(UTC)), task_outlets=[], outlet_events=[], rendered_map_index=None, type='SucceedTask')
   2025-05-20 11:07:58.830 | 2025-05-20 15:07:58.829912 [debug    ] send_request_headers.complete  [httpcore.http11]
   2025-05-20 11:07:58.830 | 2025-05-20 15:07:58.829952 [debug    ] send_request_body.started request=<Request [b'PATCH']> [httpcore.http11]
   2025-05-20 11:07:58.830 | 2025-05-20 15:07:58.830035 [debug    ] send_request_body.complete     [httpcore.http11]
   2025-05-20 11:07:58.830 | 2025-05-20 15:07:58.830120 [debug    ] receive_response_headers.started request=<Request [b'PATCH']> [httpcore.http11]
   2025-05-20 11:07:58.839 | 2025-05-20 15:07:58.839856 [debug    ] receive_response_headers.complete return_value=(b'HTTP/1.1', 204, b'No Content', [(b'date', b'Tue, 20 May 2025 15:07:57 GMT'), (b'server', b'uvicorn'), (b'content-type', b'application/json'), (b'airflow-api-version', b'2025-04-28')]) [httpcore.http11]
   2025-05-20 11:07:58.840 | 2025-05-20 15:07:58.840124 [debug    ] receive_response_body.started request=<Request [b'PATCH']> [httpcore.http11]
   2025-05-20 11:07:58.840 | 2025-05-20 15:07:58.840184 [debug    ] receive_response_body.complete [httpcore.http11]
   2025-05-20 11:07:58.840 | 2025-05-20 15:07:58.840226 [debug    ] response_closed.started        [httpcore.http11]
   2025-05-20 11:07:58.840 | 2025-05-20 15:07:58.840261 [debug    ] response_closed.complete       [httpcore.http11]
   2025-05-20 11:07:58.840 | 2025-05-20 15:07:58.840614 [info     ] Task finished                  [supervisor] duration=0.14550216700001783 exit_code=0 final_state=success
   2025-05-20 11:07:58.845 | 2025-05-20 15:07:58.845742 [info     ] Task execute_workload[75db0d4c-dfb7-474b-bc27-5c7153b5a6d7] succeeded in 0.18332333399996514s: None [celery.app.trace]
   2025-05-20 11:08:04.710 | 
   2025-05-20 11:08:04.710 | worker: Warm shutdown (MainProcess)
   2025-05-20 11:08:04.871 | 2025-05-20 15:08:04.871226 [debug    ] | Worker: Closing Hub...       [celery.bootsteps]
   2025-05-20 11:08:04.871 | 2025-05-20 15:08:04.871427 [debug    ] | Worker: Closing Pool...      [celery.bootsteps]
   2025-05-20 11:08:04.871 | 2025-05-20 15:08:04.871775 [debug    ] | Worker: Closing Consumer...  [celery.bootsteps]
   2025-05-20 11:08:04.871 | 2025-05-20 15:08:04.871860 [debug    ] | Worker: Stopping Consumer... [celery.bootsteps]
   2025-05-20 11:08:04.872 | 2025-05-20 15:08:04.872193 [debug    ] | Consumer: Closing Connection... [celery.bootsteps]
   2025-05-20 11:08:04.872 | 2025-05-20 15:08:04.872303 [debug    ] | Consumer: Closing Events...  [celery.bootsteps]
   2025-05-20 11:08:04.872 | 2025-05-20 15:08:04.872361 [debug    ] | Consumer: Closing Heart...   [celery.bootsteps]
   2025-05-20 11:08:04.872 | 2025-05-20 15:08:04.872578 [debug    ] | Consumer: Closing Mingle...  [celery.bootsteps]
   2025-05-20 11:08:04.872 | 2025-05-20 15:08:04.872744 [debug    ] | Consumer: Closing Gossip...  [celery.bootsteps]
   2025-05-20 11:08:04.873 | 2025-05-20 15:08:04.872917 [debug    ] | Consumer: Closing Tasks...   [celery.bootsteps]
   2025-05-20 11:08:04.873 | 2025-05-20 15:08:04.873062 [debug    ] | Consumer: Closing Control... [celery.bootsteps]
   2025-05-20 11:08:04.873 | 2025-05-20 15:08:04.873239 [debug    ] | Consumer: Closing event loop... [celery.bootsteps]
   2025-05-20 11:08:04.873 | 2025-05-20 15:08:04.873308 [debug    ] | Consumer: Stopping event loop... [celery.bootsteps]
   2025-05-20 11:08:04.873 | 2025-05-20 15:08:04.873472 [debug    ] | Consumer: Stopping Control... [celery.bootsteps]
   2025-05-20 11:08:04.877 | 2025-05-20 15:08:04.876854 [debug    ] | Consumer: Stopping Tasks...  [celery.bootsteps]
   2025-05-20 11:08:04.877 | 2025-05-20 15:08:04.877093 [debug    ] Canceling task consumer...     [celery.worker.consumer.tasks]
   2025-05-20 11:08:04.877 | 2025-05-20 15:08:04.877265 [debug    ] | Consumer: Stopping Gossip... [celery.bootsteps]
   2025-05-20 11:08:04.878 | 2025-05-20 15:08:04.878535 [debug    ] | Consumer: Stopping Mingle... [celery.bootsteps]
   2025-05-20 11:08:04.878 | 2025-05-20 15:08:04.878614 [debug    ] | Consumer: Stopping Heart...  [celery.bootsteps]
   2025-05-20 11:08:04.879 | 2025-05-20 15:08:04.879170 [debug    ] | Consumer: Stopping Events... [celery.bootsteps]
   2025-05-20 11:08:04.879 | 2025-05-20 15:08:04.879259 [debug    ] | Consumer: Stopping Connection... [celery.bootsteps]
   2025-05-20 11:08:04.879 | 2025-05-20 15:08:04.879420 [debug    ] | Worker: Stopping Pool...     [celery.bootsteps]
   2025-05-20 11:08:05.931 | 2025-05-20 15:08:05.931436 [debug    ] | Worker: Stopping Hub...      [celery.bootsteps]
   2025-05-20 11:08:05.931 | 2025-05-20 15:08:05.931613 [debug    ] | Consumer: Shutdown Control... [celery.bootsteps]
   2025-05-20 11:08:05.931 | 2025-05-20 15:08:05.931704 [debug    ] | Consumer: Shutdown Tasks...  [celery.bootsteps]
   2025-05-20 11:08:05.931 | 2025-05-20 15:08:05.931777 [debug    ] Canceling task consumer...     [celery.worker.consumer.tasks]
   2025-05-20 11:08:05.931 | 2025-05-20 15:08:05.931851 [debug    ] Closing consumer channel...    [celery.worker.consumer.tasks]
   2025-05-20 11:08:05.931 | 2025-05-20 15:08:05.931893 [debug    ] | Consumer: Shutdown Gossip... [celery.bootsteps]
   2025-05-20 11:08:05.932 | 2025-05-20 15:08:05.932053 [debug    ] | Consumer: Shutdown Heart...  [celery.bootsteps]
   2025-05-20 11:08:05.932 | 2025-05-20 15:08:05.932124 [debug    ] | Consumer: Shutdown Events... [celery.bootsteps]
   2025-05-20 11:08:05.932 | 2025-05-20 15:08:05.932392 [debug    ] | Consumer: Shutdown Connection... [celery.bootsteps]
   2025-05-20 11:08:05.934 | 2025-05-20 15:08:05.934379 [debug    ] Calling callbacks: []          [airflow.utils.cli_action_loggers]
   2025-05-20 11:08:05.935 | [2025-05-20 15:08:05 +0000] [24] [INFO] Handling signal: term
   2025-05-20 11:08:05.936 | [2025-05-20 15:08:05 +0000] [26] [INFO] Worker exiting (pid: 26)
   2025-05-20 11:08:05.936 | 2025-05-20 15:08:05.936244 [debug    ] removing tasks from inqueue until task handler finished [celery.concurrency.asynpool]
   2025-05-20 11:08:05.936 | [2025-05-20 15:08:05 +0000] [27] [INFO] Worker exiting (pid: 27)
   2025-05-20 11:08:06.042 | [2025-05-20 15:08:06 +0000] [24] [INFO] Shutting down: Master
   2025-05-20 11:08:06.444 | 2025-05-20 15:08:06.444383 [debug    ] Disposing DB connection pool (PID 7) [airflow.settings]
   ```

