On 2018/09/16 05:53:02, Bhavani Ramasamy <vsr.bhav...@gmail.com> wrote: 
> Hello Team,
> I am trying to set up S3 logging with Docker & the CeleryExecutor. Log
> files are not being written to S3. I have configured airflow.cfg like below,
> 
> remote_logging = True
> 
> remote_log_conn_id = s3_connection_mine
> 
> remote_base_log_folder = s3://mybucket/airflow/logs/
> 
> 
> I have tried leaving *logging_config_class* empty, as well as pointing it
> at a custom log_config.py via an airflow_local_settings.py file; neither
> works. Can you please help me?
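> 
> In case it matters, here is roughly how I created that connection (a
> sketch only; the credentials are placeholders, and I am using the
> Airflow 1.10 CLI):
> 
>     airflow connections --add \
>         --conn_id s3_connection_mine \
>         --conn_type s3 \
>         --conn_extra '{"aws_access_key_id": "...", "aws_secret_access_key": "..."}'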
> 
> 
> Thanks,
> 
> Bhavani
Hello Kyle,
I am not using the DockerOperator; I am trying to run apache-airflow 1.10 
inside a Docker container. When I use the LocalExecutor, logs are written to 
S3, since the Airflow webserver and scheduler all reside inside one Docker 
container. But when I try to use the CeleryExecutor, the logs are not 
written to S3. Is there any special configuration needed when using the 
CeleryExecutor with S3 logging?
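
In case it helps to see what I mean, here is a sketch of how I would pass 
the same settings to the worker containers, using Airflow's 
AIRFLOW__<SECTION>__<KEY> environment-variable convention (the service name 
and image below are just placeholders for my setup):

    # docker-compose.yml (sketch)
    version: "2"
    services:
      worker:
        image: my-airflow:1.10          # placeholder image name
        command: airflow worker         # Celery worker process
        environment:
          # Same remote-logging settings as in airflow.cfg, so every
          # worker container sees them, not just webserver/scheduler:
          - AIRFLOW__CORE__REMOTE_LOGGING=True
          - AIRFLOW__CORE__REMOTE_LOG_CONN_ID=s3_connection_mine
          - AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER=s3://mybucket/airflow/logs/

My understanding is that with the CeleryExecutor the task logs are uploaded 
by the worker process itself, so the workers need these settings too; with 
the LocalExecutor everything ran in one container, which would explain why 
that case worked.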
Thanks,
Bhavani
