seanghaeli commented on issue #57356:
URL: https://github.com/apache/airflow/issues/57356#issuecomment-3792261086

   @ashb I repro'd it by enabling CloudWatch logging in `airflow.cfg` according 
to [these 
docs](https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/logging/cloud-watch-task-handlers.html).
 Then make sure AWS credentials are set up; I did this by creating a connection 
via `AIRFLOW_CONN_AWS_DEFAULT='aws://'`. Then start Airflow (I used Breeze) 
and run any DAG, such as:
   
   ```python
   from datetime import datetime

   from airflow import DAG
   from airflow.operators.python import PythonOperator

   with DAG(dag_id="test_cloudwatch", start_date=datetime(2020, 1, 1)) as dag:
       def print_logs():
           print("Hello from task!")

       task = PythonOperator(task_id="test_task", python_callable=print_logs)
   ```
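   
   For reference, the `airflow.cfg` change from the linked docs looks roughly like the sketch below. The log-group ARN is a placeholder I made up for illustration, not the one from my setup:
   
   ```ini
   [logging]
   ; enable shipping task logs to a remote store
   remote_logging = True
   ; connection used for AWS credentials (matches AIRFLOW_CONN_AWS_DEFAULT above)
   remote_log_conn_id = aws_default
   ; cloudwatch:// URL wrapping the log-group ARN (placeholder ARN below)
   remote_base_log_folder = cloudwatch://arn:aws:logs:us-east-1:123456789012:log-group:airflow-task-logs
   ```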
   
   In the UI for that task run, the logs show an error message associated 
with CloudWatch remote logging.

