prithvisathiya commented on issue #8212:
URL: https://github.com/apache/airflow/issues/8212#issuecomment-653952342


   I'm also experiencing the exact same issue after upgrading to `1.10.9`, but I'm still using `LocalExecutor`. I can clearly see from the S3 console that the logs are getting uploaded, but the Airflow UI is unable to read them back. It attempts to read the local folder and then just gives up with the following error:
   
   ```bash
   *** Log file does not exist: /app/logs/dag_name/task_name/2020-07-05T19:21:09.715128+00:00/1.log
   *** Fetching from: http://airflow-test-scheduler-5f8ccc76df-tc8j8:8793/log/dag_name/task_name/2020-07-05T19:21:09.715128+00:00/1.log
   *** Failed to fetch log file from worker. HTTPConnectionPool(host='airflow-test-scheduler-5f8ccc76df-tc8j8', port=8793): Max retries exceeded with url: /log/dag_name/task_name/2020-07-05T19:21:09.715128+00:00/1.log
   ```
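   To double-check outside the console, something along these lines can confirm that the object the UI is looking for really is in S3. This is only a sketch: the bucket name is a placeholder and the key mirrors the path from the error above.
   
   ```python
   # Sketch only: confirm the task log object was actually uploaded to S3.
   # "my-airflow-bucket" is a placeholder; the key mirrors the path in the error above.
   import boto3
   
   s3 = boto3.client("s3", region_name="us-west-2")
   key = "logs/dag_name/task_name/2020-07-05T19:21:09.715128+00:00/1.log"
   
   # head_object raises botocore.exceptions.ClientError (404) if the key is missing
   resp = s3.head_object(Bucket="my-airflow-bucket", Key=key)
   print("Log object exists, size:", resp["ContentLength"])
   ```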
   
   My configuration is below:
   ```
   AIRFLOW__CORE__EXECUTOR=LocalExecutor
   AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER=s3://{bucket}/logs/
   AIRFLOW__CORE__REMOTE_LOGGING=True
   AIRFLOW__CORE__REMOTE_LOG_CONN_ID=aws_default
   AIRFLOW_CONN_AWS_DEFAULT=aws://?region_name=us-west-2
   ```
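   
   In case it helps with debugging the read path: a rough sketch, run from inside the webserver container, that exercises the same `aws_default` connection through the 1.10.x `S3Hook` (bucket and key below are placeholders):
   
   ```python
   # Rough sketch: check that the webserver's aws_default connection can read the
   # log back, independent of the UI. Bucket name and key are placeholders.
   from airflow.hooks.S3_hook import S3Hook
   
   hook = S3Hook(aws_conn_id="aws_default")
   bucket = "my-airflow-bucket"
   key = "logs/dag_name/task_name/2020-07-05T19:21:09.715128+00:00/1.log"
   
   if hook.check_for_key(key, bucket_name=bucket):
       print(hook.read_key(key, bucket_name=bucket)[:200])
   else:
       print("Key not found via aws_default connection")
   ```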

