I am trying to set up Airflow as a cluster. The Celery executor is unable to
connect to RabbitMQ while executing jobs. Here are the configurations:

Machine 1: webserver and scheduler
Machine 2: webserver
Machine 3: worker
Machine 4: Rabbitmq

Airflow version: v1.8.0
Celery version: 4.1.0
Flower UI version: 0.9.1



**airflow.cfg**
airflow_home = ~/airflow
dags_folder = ~/airflow/dags
base_log_folder = ~/airflow/logs
executor = CeleryExecutor
sql_alchemy_conn = mysql://reco_airflow:password@10.32.170.111:3306/airflow
sql_alchemy_pool_size = 5
dag_concurrency = 16
dags_are_paused_at_creation = False
plugins_folder = ~/airflow/plugins
# Secret key to save connection passwords in the db
api_client = airflow.api.client.local_client
endpoint_url = http://10.34.110.227:8080
base_url = http://10.34.110.227:8080
web_server_host = 0.0.0.0
web_server_port = 8080
workers = 4
broker_url = amqp://guest:guest@10.34.94.212:5672//
celery_result_backend = db+mysql://reco_airflow:password@10.32.170.109:3306/airflow
flower_host = 10.34.110.227
flower_port = 5555
default_queue = queue
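One thing worth double-checking is how the trailing `//` in `broker_url` is parsed: in an AMQP URL the path component is the vhost, so `amqp://guest:guest@10.34.94.212:5672//` means vhost `/` (RabbitMQ's default). A quick stdlib sanity check of the URL from the config above:

```python
from urllib.parse import urlsplit

# broker_url copied from the airflow.cfg above
broker_url = "amqp://guest:guest@10.34.94.212:5672//"

parts = urlsplit(broker_url)
vhost = parts.path[1:]  # everything after the first "/" is the vhost

print(parts.scheme)    # amqp
print(parts.hostname)  # 10.34.94.212
print(parts.port)      # 5672
print(repr(vhost))     # '/' -> RabbitMQ's default vhost
```

If the vhost does not come out as `/` (or whatever vhost the `guest` user has permissions on), the workers will fail to attach to the broker.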

DAGs are running successfully with the Celery executor, but I can't see any
connection to RabbitMQ (it is always idle). Also, if I try to use Flower,
the UI opens but keeps on loading and stops responding after ~5 seconds.
No errors appear in the logs. Am I missing something in the configuration?
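In case it helps narrow this down, here is a minimal TCP reachability probe I would run from the worker machine to rule out a firewall between the worker and the broker (host and port taken from the broker_url above; the throwaway local listener at the end exists only to demonstrate the helper):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# From the worker machine this should be True; if it is False, a firewall
# rule or RabbitMQ's bind address is the likely culprit:
# can_connect("10.34.94.212", 5672)

# Self-contained demo against a throwaway local listener:
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
host, port = srv.getsockname()
print(can_connect(host, port))  # True: the port is open and accepting
srv.close()
```

If the probe succeeds but RabbitMQ still shows no connections, the next thing I would check is whether the `guest` user is allowed to connect from a remote host (by default RabbitMQ restricts `guest` to localhost).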


Here is the Stack Overflow link:

https://stackoverflow.com/questions/54367536/issues-in-running-airflow-as-cluster-with-celery-executors
