GitHub user samucabrave closed a discussion: The logs are not being shown in 
the UI.

Hello,

I'm deploying Airflow in an EKS cluster using the helm_release Terraform 
resource. Before opening this issue, I created the PV, the PVC, and the necessary 
AWS policies to assume the role, but I still can't see the logs. My 
configuration and the errors are below.

### values.yaml
```yaml
extraVolumes:
  - name: s3-volume
    persistentVolumeClaim:
      claimName: airflow-pvc
  - name: pod-template-volume
    configMap:
      name: airflow-pod-template

extraVolumeMounts:
  - name: s3-volume
    mountPath: /opt/airflow/logs
  - name: pod-template-volume
    mountPath: /opt/airflow/pod_templates
    readOnly: true

config:
  kubernetes:
    pod_template_file: /opt/airflow/pod_templates/pod_template.yaml
  logging:
    remote_logging: 'true'
    remote_log_conn_id: 's3_connection'
    remote_base_log_folder: 's3://bucket/logs'
    encrypt_s3_logs: 'false'
    logging_level: 'INFO'

connections:
  s3_connection:
    conn_id: 's3_connection'
    conn_type: 's3'
    extra: |
      {
        "region_name": "${var.aws_region}"
      }
```
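For reference, here is the set of S3 permissions the task role is commonly expected to need for Airflow remote logging (the handler reads existing logs, writes new ones, and checks for keys), sketched as a policy document built in Python. The bucket name and prefix are placeholders taken from `remote_base_log_folder` above, not my real values:

```python
import json

# Placeholders matching remote_base_log_folder = s3://bucket/logs
BUCKET = "bucket"
PREFIX = "logs"

# Minimal policy commonly needed by the S3 task handler:
# GetObject/PutObject on the log prefix, ListBucket on the bucket itself
# (the existence check uses HeadObject, which needs ListBucket to
# distinguish "missing" from "forbidden").
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": f"arn:aws:s3:::{BUCKET}/{PREFIX}/*",
        },
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": f"arn:aws:s3:::{BUCKET}",
        },
    ],
}

print(json.dumps(policy, indent=2))
```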

### Execution log
```
 [2025-07-14T20:03:41.449+0000] {local_task_job_runner.py:243} INFO - Task 
exited with return code 1
 [2025-07-14T20:03:41.484+0000] {taskinstance.py:3503} INFO - 0 downstream 
tasks scheduled from follow-on schedule check
 [2025-07-14T20:03:41.486+0000] {local_task_job_runner.py:222} INFO - 
::endgroup::
 [2025-07-14T20:03:41.559+0000] {.py:84} INFO - Using connection ID 
's3_connection' for task execution.
 [2025-07-14T20:03:41.801+0000] {credentials.py:1075} INFO - Found credentials 
from IAM Role: <role>
 [2025-07-14T20:03:41.972+0000] {s3_task_handler.py:178} ERROR - Could not 
verify previous log to append
 Traceback (most recent call last):
   File 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py",
 line 174, in s3_write
     if append and self.s3_log_exists(remote_log_location):
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py",
 line 141, in s3_log_exists
     return self.hook.check_for_key(remote_log_location)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/providers/amazon/aws/hooks/s3.py",
 line 135, in wrapper
     return func(*bound_args.args, **bound_args.kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/providers/amazon/aws/hooks/s3.py",
 line 89, in wrapper
     return func(*bound_args.args, **bound_args.kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/providers/amazon/aws/hooks/s3.py",
 line 922, in check_for_key
     obj = self.head_object(key, bucket_name)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/providers/amazon/aws/hooks/s3.py",
 line 135, in wrapper
     return func(*bound_args.args, **bound_args.kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/providers/amazon/aws/hooks/s3.py",
 line 89, in wrapper
     return func(*bound_args.args, **bound_args.kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/providers/amazon/aws/hooks/s3.py",
 line 907, in head_object
     raise e
   File 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/providers/amazon/aws/hooks/s3.py",
 line 902, in head_object
     return self.get_conn().head_object(Bucket=bucket_name, Key=key)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/botocore/client.py", 
line 565, in _api_call
     return self._make_api_call(operation_name, kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/botocore/client.py", 
line 1021, in _make_api_call
     raise error_class(parsed_response, operation_name)
 botocore.exceptions.ClientError: An error occurred (403) when calling the 
HeadObject operation: Forbidden ()
```
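One detail worth noting about the 403 above: S3 returns `403 Forbidden` from `HeadObject` not only when `s3:GetObject` is denied, but also when the caller lacks `s3:ListBucket` on the bucket, in which case even a key that simply doesn't exist yields 403 instead of 404. A small sketch of how I'd classify the error code (the boto3 call itself is shown only in comments, since it needs live credentials):

```python
def classify_head_object_error(code: str) -> str:
    """Map the error code of a failed HeadObject call to a likely cause."""
    if code == "404":
        return "key missing (permissions OK)"
    if code == "403":
        # Without s3:ListBucket on the bucket, S3 answers 403 even for
        # absent keys, so a 403 here usually points at the role's policy
        # rather than at the log file itself.
        return "forbidden: likely missing s3:ListBucket or s3:GetObject"
    return f"other error: {code}"

# To reproduce the handler's check directly (requires AWS credentials):
#   import boto3
#   from botocore.exceptions import ClientError
#   try:
#       boto3.client("s3").head_object(Bucket="bucket", Key="logs/...")
#   except ClientError as e:
#       print(classify_head_object_error(e.response["Error"]["Code"]))
```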

### Logs in UI

<img width="946" height="401" alt="image" 
src="https://github.com/user-attachments/assets/2a3e23c7-eaa4-4e38-9459-fab5e0bcac9b" />


For context: we decided to use S3 because of some internal processes, which is 
why we set it up this way. The main functionality works; I just want to 
understand how the logs can be stored in this bucket.

GitHub link: https://github.com/apache/airflow/discussions/53354
