I am using Airflow 1.10.6, and below is my logging configuration for running Airflow on Kubernetes. I set up Kubernetes to ship all console output to Elasticsearch, and I am trying to configure the Airflow worker to write task logs to the console, but it does not seem to work. I can see the local log files in the pod, but the task instance logs are not written to the console, so my Filebeat daemonset cannot pick them up. Could you please help shed some light on this?
airflow:
  config:
    AIRFLOW__CORE__REMOTE_LOGGING: "True"
    # HTTP_PROXY: "http://proxy.mycompany.com:123"
    AIRFLOW__ELASTICSEARCH__LOG_ID_TEMPLATE: "{{dag_id}}-{{task_id}}-{{execution_date}}-{{try_number}}"
    AIRFLOW__ELASTICSEARCH__END_OF_LOG_MARK: "end_of_log"
    AIRFLOW__ELASTICSEARCH__WRITE_STDOUT: "True"
    AIRFLOW__ELASTICSEARCH__JSON_FORMAT: "True"
    AIRFLOW__ELASTICSEARCH__JSON_FIELDS: "asctime, filename, lineno, levelname, message"
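For reference, here is roughly how the LOG_ID_TEMPLATE above expands per task instance. This is a minimal illustration only; render_log_id and the example field values are hypothetical, not Airflow's actual API:

```python
# Sketch of how a log_id is assembled from the template above.
# Airflow substitutes task-instance fields into the {{...}} placeholders;
# str.format() mimics that substitution here.
TEMPLATE = "{dag_id}-{task_id}-{execution_date}-{try_number}"

def render_log_id(dag_id, task_id, execution_date, try_number):
    """Hypothetical helper: fill the template with task-instance fields."""
    return TEMPLATE.format(
        dag_id=dag_id,
        task_id=task_id,
        execution_date=execution_date,
        try_number=try_number,
    )

print(render_log_id("my_dag", "my_task", "2020-01-01T00:00:00+00:00", 1))
# -> my_dag-my_task-2020-01-01T00:00:00+00:00-1
```

This is the key that the Airflow webserver later uses to query Elasticsearch for a task's log lines, so the documents indexed by Filebeat need to carry a matching log_id.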
