Hi Emad,

1.9.5 isn't a valid Airflow version...? The release numbering went from 1.9.0 to 1.10.10

What's in /var/log/airflow/worker-logs.log and 
/var/log/airflow/worker-errors.log?

What's in your script/airflow-*.sh files?
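
(For context: those wrappers are usually just a thin foreground shell around the 
airflow CLI. A minimal sketch, assuming a virtualenv at /home/airflow/venv; the 
paths here are guesses, not taken from your setup:)

#!/usr/bin/env bash
# Run the worker in the foreground so supervisor can track and signal it.
set -euo pipefail
export AIRFLOW_HOME=/home/airflow
source /home/airflow/venv/bin/activate
# exec so the stop signal sent by supervisor reaches the airflow process itself
exec airflow worker
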
Thanks,
-ash
On Jul 13 2020, at 12:41 pm, Emad Mokhtar <[email protected]> wrote:
> Overview
> I updated Apache Airflow from 1.9.5 to 1.10.10. I'm managing the Airflow 
> processes/components via supervisor. I have an ansible-playbook that restarts 
> these processes every time I deploy changes to the server.
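>
> (For reference, the restart step in that playbook boils down to supervisorctl 
> calls along these lines; the program names are the ones defined in the config 
> below, the rest is illustrative:)
>
> # restart the Airflow components managed by supervisor
> supervisorctl restart airflow-webserver airflow-scheduler airflow-worker
> # confirm they came back up
> supervisorctl status
>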
> The problem
> This script was working fine with 1.9.5, but after the upgrade, every time I 
> restart the Airflow processes they seem to lose the state of the tasks and 
> DAGs. This is one of the failure emails I get after the restart:
> Exception:
> Executor reports task instance finished (failed) although the task says its 
> queued. Was the task killed externally?
> Configurations
> supervisor conf
> [program:airflow-webserver]
> command = /home/airflow/airflow-dags/script/airflow-webserver.sh
> directory = /home/airflow/
> environment=HOME="/home/airflow",USER="airflow",PATH="{{ airflow_python_path }}:{{ airflow_venv_path }}/bin:%(ENV_PATH)s"
> user = airflow
> stdout_logfile = /var/log/airflow/webserver-logs.log
> stderr_logfile = /var/log/airflow/webserver-errors.log
> stdout_logfile_backups = 0
> redirect_stderr = true
> autostart = true
> autorestart = true
> startretries = 3
> stopsignal=QUIT
> stopasgroup=true
>
> [program:airflow-scheduler]
> command = /home/airflow/airflow-dags/script/airflow-scheduler.sh
> directory = /home/airflow/
> environment=HOME="/home/airflow",USER="airflow",PATH="{{ airflow_python_path }}:{{ airflow_venv_path }}/bin:%(ENV_PATH)s"
> user = airflow
> stdout_logfile = /var/log/airflow/scheduler-logs.log
> stderr_logfile = /var/log/airflow/scheduler-errors.log
> stdout_logfile_backups = 0
> redirect_stderr = true
> autostart = true
> autorestart = true
> startretries = 3
> stopsignal=QUIT
> stopasgroup=true
> killasgroup=true
>
> [program:airflow-worker]
> command = /home/airflow/airflow-dags/script/airflow-worker.sh
> directory = /home/airflow/
> environment=HOME="/home/airflow",USER="airflow",PATH="{{ airflow_python_path }}:{{ airflow_venv_path }}/bin:%(ENV_PATH)s"
> user = airflow
> stdout_logfile = /var/log/airflow/worker-logs.log
> stderr_logfile = /var/log/airflow/worker-errors.log
> stdout_logfile_backups = 0
> redirect_stderr = true
> autostart = true
> autorestart = true
> startretries = 3
> stopsignal=QUIT
> stopasgroup=true
> killasgroup=true
>
> Originally posted on StackOverflow 
> (https://stackoverflow.com/questions/62873210/restart-apache-airflow-managed-by-supervisor)
> --
> Thanks and best regards,
> Emad Mokhtar
> MobPro, mobile first media agency
>
> We work closely together with 24AM (https://its24am.com/), the mobile first 
> media agency.
> 020-7028200
> LinkedIn (https://www.linkedin.com/in/emadmokhtar/), www.mobpro.com 
> (http://www.mobpro.com/)