michaelkhan3 commented on issue #15279:
URL: https://github.com/apache/airflow/issues/15279#issuecomment-937418131


   > i've gotten around it by creating a `log_config.py` with the following:
   > 
   > ```python
   > from copy import deepcopy
   > import logging
   > 
   > from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG
   > 
   > LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
   > class EmptyLogFilter(logging.Filter):
   >     def filter(self, record: logging.LogRecord) -> bool:
   >         # Drop records whose rendered message is empty.
   >         return bool(record.getMessage())
   > 
   > 
   > LOGGING_CONFIG['filters']['remove_blanks'] = {'()': 'log_config.EmptyLogFilter'}
   > 
   > CLOUDWATCH_HANDLER = 'airflow.providers.amazon.aws.log.cloudwatch_task_handler.CloudwatchTaskHandler'
   > 
   > TASK_HANDLER = LOGGING_CONFIG['handlers']['task']
   > if TASK_HANDLER['class'] == CLOUDWATCH_HANDLER:
   >     # This thing blows up on empty log messages so apply filter
   >     filters = TASK_HANDLER.get('filters', [])
   >     filters.append('remove_blanks')
   >     TASK_HANDLER['filters'] = filters
   > ```
   > 
   > Don't forget to enable this logging config in `airflow.cfg` with `logging_config_class = log_config.LOGGING_CONFIG`.
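   
   The filter in the quoted snippet can be exercised standalone, without Airflow, to confirm it drops blank messages. A minimal sketch; `make_record` is just a test helper here, not an Airflow or stdlib API:
   
   ```python
   import logging
   
   class EmptyLogFilter(logging.Filter):
       """Drop records whose rendered message is empty."""
       def filter(self, record: logging.LogRecord) -> bool:
           return bool(record.getMessage())
   
   def make_record(msg: str) -> logging.LogRecord:
       # Build a bare LogRecord by hand for testing; the name/path are arbitrary.
       return logging.LogRecord(
           name="test", level=logging.INFO, pathname="test.py",
           lineno=0, msg=msg, args=None, exc_info=None,
       )
   
   f = EmptyLogFilter()
   print(f.filter(make_record("")))       # False: empty message is filtered out
   print(f.filter(make_record("hello")))  # True: non-empty message passes through
   ```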
   
   Where did you add the `log_config.py` file — in the Airflow Docker image, or alongside the DAGs?
   
   

