aioannoa opened a new issue #17605: URL: https://github.com/apache/airflow/issues/17605
**Apache Airflow version**: Airflow 2.0.0

**Apache Airflow Provider versions**:
- apache-airflow-providers-ftp==1.0.0
- apache-airflow-providers-http==1.0.0
- apache-airflow-providers-imap==1.0.0
- apache-airflow-providers-postgres==1.0.1
- apache-airflow-providers-sqlite==1.0.0

**Kubernetes version (if you are using kubernetes)**:
- Client Version: version.Info{Major:"1", Minor:"15", GitVersion:"v1.15.0", GitCommit:"e8462b5b5dc2584fdcd18e6bcfe9f1e4d970a529", GitTreeState:"clean", BuildDate:"2019-06-19T16:40:16Z", GoVersion:"go1.12.5", Compiler:"gc", Platform:"linux/amd64"}
- Server Version: version.Info{Major:"1", Minor:"15", GitVersion:"v1.15.12", GitCommit:"e2a822d9f3c2fdb5c9bfbe64313cf9f657f0a725", GitTreeState:"clean", BuildDate:"2020-05-06T05:09:48Z", GoVersion:"go1.12.17", Compiler:"gc", Platform:"linux/amd64"}

**Environment**:
- **Cloud provider or hardware configuration**: AWS Cloud - EC2 instance
- **OS** (e.g. from /etc/os-release): Ubuntu 16.04.3 LTS (Xenial Xerus)
- **Kernel** (e.g. `uname -a`): Linux ip-172-25-1-109 4.4.0-1128-aws #142-Ubuntu SMP Fri Apr 16 12:42:33 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux
- **Install tools**:
- **Others**:

**What happened**:

I have been trying to get Airflow task logs printed to stdout by:

1. Creating a new Python script named `log_config.py` under the `config` directory, as instructed at https://airflow.apache.org/docs/apache-airflow/stable/logging-monitoring/logging-tasks.html (shown below).
2. Setting `logging_config_class = log_config.LOGGING_CONFIG` and `task_log_reader = stdouttask` in the `airflow.cfg` file.

```python
from copy import deepcopy
import sys

from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
LOGGING_CONFIG["handlers"]["stdouttask"] = {
    "class": "logging.StreamHandler",
    "formatter": "airflow",
    "stream": sys.stdout,
}
LOGGING_CONFIG["loggers"]["airflow.task"]["handlers"] = ["stdouttask"]
```

After checking, the logs are still written to a file, so this has not worked for me.

While searching online I noticed that people suggest using the `console` handler instead. However, with that handler the pod becomes unresponsive, and this affects other pods as well as the rest of the system: ssh stops working for some time, I guess while the DAG runs, and other pods become unresponsive or even go down, e.g. the cluster's database. I have read that there may be memory leak issues with this approach. Has anyone been able to verify whether that is the case, and under which circumstances it causes a problem? I have not been able to find a clear answer, either online or in the Airflow documentation.

**What you expected to happen**:

I was expecting all Airflow logs to be printed to stdout alone.

What do I think went wrong? For the 1st part, no idea. For the 2nd part, a memory leak or CPU exhaustion.

**How to reproduce it**:

For the 1st part, try the above as is. For the 2nd part, try:

```python
LOGGING_CONFIG["loggers"]["airflow.task"]["handlers"] = ["console", "stdouttask"]
```

**Anything else we need to know**:

How often does this problem occur? Every time.
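In case it helps with triage: one variant I plan to try, based on suggestions I found, is to reference the stream by name instead of embedding the `sys.stdout` object in the config dict. This is only a sketch and I have not confirmed it changes the outcome; it assumes the standard `logging.config.dictConfig` behaviour of resolving `"ext://..."` strings to the named external object.

```python
# config/log_config.py -- hypothetical variant, not verified to fix the issue.
from copy import deepcopy

from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
LOGGING_CONFIG["handlers"]["stdouttask"] = {
    "class": "logging.StreamHandler",
    "formatter": "airflow",
    # dictConfig resolves "ext://sys.stdout" to the sys.stdout object at
    # configuration time, so the dict stays free of live file objects.
    "stream": "ext://sys.stdout",
}
LOGGING_CONFIG["loggers"]["airflow.task"]["handlers"] = ["stdouttask"]
```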