chyzouakia opened a new issue, #55375:
URL: https://github.com/apache/airflow/issues/55375

   ### Apache Airflow Provider(s)
   
   amazon
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon==8.29.0
   
   ### Apache Airflow version
   
   2.10.5
   
   ### Operating System
   
   Debian GNU/Linux 12 (bookworm)
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   Deployment through the Helm chart `https://airflow-helm.github.io/charts`, version `8.9.0`.
   Airflow image: `slim-2.10.5-python3.10`
   My deployment creates a Service Account that allows the pods to connect to our AWS S3 bucket so the logs can be pushed there.
   
   Configuration for logging:
   * AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER: "s3://{{ bucket }}/airflow/logs"
   * AIRFLOW__LOGGING__REMOTE_LOGGING: "True"
   * AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID: "s3_remote_logging"
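   
   A quick way to double-check that these settings are actually picked up inside the worker (a sketch only; I have not captured the output here) is to read them back through `airflow.configuration.conf`:
   
   ```python
   # Sketch: read the remote-logging settings back inside a worker pod; the
   # values should match the configuration listed above.
   from airflow.configuration import conf
   
   print(conf.getboolean("logging", "remote_logging"))   # expected: True
   print(conf.get("logging", "remote_base_log_folder"))  # expected: s3://{{ bucket }}/airflow/logs
   print(conf.get("logging", "remote_log_conn_id"))      # expected: s3_remote_logging
   ```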
   
   ### What happened
   
   It seems the `close()` method of `S3TaskHandler` is not called: no logs are stored in S3.
   I am able to read a file from S3:
   
   <img width="909" height="325" alt="Image" src="https://github.com/user-attachments/assets/b442d462-8f7c-4cb7-9205-ff10b7d85f2d" />
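   
   One related check (a sketch, run in a Python shell inside the worker container where Airflow's logging configuration has been loaded) is whether the `airflow.task` logger even has the `S3TaskHandler` attached:
   
   ```python
   # Sketch: list the handlers attached to the "airflow.task" logger; with
   # remote S3 logging active, S3TaskHandler is expected to appear here.
   import logging
   
   import airflow  # noqa: F401  # importing airflow applies the configured logging settings
   
   task_logger = logging.getLogger("airflow.task")
   print([type(h).__name__ for h in task_logger.handlers])
   ```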
   
   I am able to push a log from the worker manually by running this code:
   ```python
   import pathlib
   
   from airflow.providers.amazon.aws.hooks.s3 import S3Hook
   
   s3_hook = S3Hook(aws_conn_id="s3_remote_logging")
   # [LOCAL_PATH] and [S3_KEY] are placeholders for the local log file and the target S3 key
   log = pathlib.Path([LOCAL_PATH]).read_text()
   s3_hook.load_string(log, key=[S3_KEY], replace=True, encrypt=False)
   ```
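   
   As a follow-up, the manually uploaded object can be confirmed with the same hook (a sketch; the key and bucket names below are hypothetical placeholders, not the real values from my deployment):
   
   ```python
   # Sketch: confirm the manually uploaded log object exists in the bucket.
   # The key and bucket names are hypothetical placeholders.
   from airflow.providers.amazon.aws.hooks.s3 import S3Hook
   
   s3_hook = S3Hook(aws_conn_id="s3_remote_logging")
   exists = s3_hook.check_for_key(
       key="airflow/logs/dag_id=example_dag/run_id=manual/task_id=example_task/attempt=1.log",
       bucket_name="my-airflow-logs-bucket",
   )
   print(exists)  # expected: True after the manual upload above
   ```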
   
   ### What you think should happen instead
   
   When the task finishes and the handler is closed, the `S3TaskHandler` should push the logs to my S3 bucket.
   
   ### How to reproduce
   
   * Run the Airflow image `slim-2.10.5-python3.10` with the amazon provider package
   * Configure Airflow to push the logs to S3 (the connection must already exist):
     * AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER: "s3://{{ bucket }}/airflow/logs"
     * AIRFLOW__LOGGING__REMOTE_LOGGING: "True"
     * AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID: "s3_remote_logging"
   * Run a task (for example the minimal DAG sketched below) and check whether its logs are pushed to S3.
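   
   For the last step, any task will do; a minimal DAG like the following works (a sketch; the DAG and task names are illustrative, not from my deployment):
   
   ```python
   # Sketch: a minimal DAG that only emits a log line, so the S3 upload on task
   # completion can be observed. Names are illustrative.
   import logging
   
   import pendulum
   from airflow.decorators import dag, task
   
   
   @dag(schedule=None, start_date=pendulum.datetime(2024, 1, 1, tz="UTC"), catchup=False)
   def remote_logging_smoke_test():
       @task
       def say_hello():
           logging.getLogger("airflow.task").info("hello from the remote-logging smoke test")
   
       say_hello()
   
   
   remote_logging_smoke_test()
   ```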
   
   ### Anything else
   
   This problem now occurs on all my tasks. Please let me know if I have misconfigured something. Thanks!
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

