[ 
https://issues.apache.org/jira/browse/AIRFLOW-3853?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16769850#comment-16769850
 ] 

ASF GitHub Bot commented on AIRFLOW-3853:
-----------------------------------------

samuelwbock commented on pull request #4675: [AIRFLOW-3853] Default to delete 
local logs after remote upload
URL: https://github.com/apache/airflow/pull/4675
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [X] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW-3853) issues and references 
them in the PR title. For example, "\[AIRFLOW-3853\] My Airflow PR"
     - https://issues.apache.org/jira/browse/AIRFLOW-3853
   
   ### Description
   
   - [X] We've recently started to see duplicate logs in S3. After digging into 
it, we discovered that this was due to our use of the new `reschedule` mode on 
our sensors. Because the same `try_number` is used when a task reschedules, the 
local log file frequently contains results from previous attempts. 
Additionally, because `s3_task_handler.py` always tries to `append` the 
local log file to the remote log file, this can result in massive logs (we 
found one that was 400 MB).
   
   To fix this, we'd like to remove the local log after a successful upload. 
Because the file is uploaded to S3, no data will be lost.
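   The failure mode and the fix can be sketched as follows. This is a minimal illustration, not the actual `S3TaskHandler` code: `upload_and_clean`, `read_remote`, and `write_remote` are hypothetical stand-ins for the handler's S3 read/append/write calls.
   
   ```python
   import os
   
   
   def upload_and_clean(local_path, read_remote, write_remote):
       """Append the local log to the remote copy, then delete the local file.
   
       Deleting the local file after a successful upload means a rescheduled
       attempt (which reuses the same try_number, and hence the same log
       path) starts from an empty file, so previously uploaded lines are
       not appended to the remote log a second time.
       """
       with open(local_path) as f:
           log = f.read()
       existing = read_remote() or ""
       write_remote(existing + log)
       # Only remove after the upload succeeded; the data now lives in S3.
       os.remove(local_path)
   ```
   
   Without the `os.remove`, a second reschedule re-reads the whole local file (old attempts included) and appends it again, which is how the duplicated, ever-growing remote logs arise.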
   
   ### Tests
   
   - [X] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason: I've modified the following unit tests to cover the 
change to `s3_write`: `test_write`, `test_write_existing`, `test_write_raises`.
   
   ### Commits
   
   - [X] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
     1. Subject is separated from body by a blank line
     1. Subject is limited to 50 characters (not including Jira issue reference)
     1. Subject does not end with a period
     1. Subject uses the imperative mood ("add", not "adding")
     1. Body wraps at 72 characters
     1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [X] In case of new functionality, my PR adds documentation that describes 
how to use it.
     - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
     - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [X] Passes `flake8`
   
 
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Duplicate Logs appearing in S3
> ------------------------------
>
>                 Key: AIRFLOW-3853
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-3853
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: logging
>    Affects Versions: 1.10.2
>            Reporter: Sam Bock
>            Assignee: Sam Bock
>            Priority: Major
>
> We've recently started to see duplicate logs in S3. After digging into it, we 
> discovered that this was due to our use of the new `reschedule` mode on our 
> sensors. Because the same `try_number` is used when a task reschedules, the 
> local log file frequently contains results from previous attempts. 
> Additionally, because `s3_task_handler.py` always tries to `append` the 
> local log file to the remote log file, this can result in massive logs (we 
> found one that was 400 MB).
> To fix this, we'd like to remove the local log after a successful upload.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
