May I ask why you are using Airflow 1.8.0, which is almost two years old?
Airflow 1.8.0 also pins Celery to 3.x, not 4.1.0 [1], which may be the cause.
I'd suggest trying the latest apache-airflow, 1.10.2.
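To illustrate the mismatch: if the pin at [1] is an upper bound below 4.0 (worth verifying against the linked setup.py), then the installed Celery 4.1.0 falls outside the supported range. A minimal sketch of that comparison (the helper name is mine, not an Airflow API):

```python
# Sketch: Airflow 1.8.0's setup.py constrains Celery [1]; assuming the pin
# is an upper bound of 4.0, Celery 4.1.0 is outside the supported range.
def satisfies_upper_bound(version, bound):
    """True if `version` is strictly below `bound` (simple dotted-int compare)."""
    to_tuple = lambda v: tuple(int(p) for p in v.split("."))
    return to_tuple(version) < to_tuple(bound)

print(satisfies_upper_bound("4.1.0", "4.0"))   # False: outside a <4.0 pin
print(satisfies_upper_bound("3.1.17", "4.0"))  # True: within the pinned range
```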

Kind Regards,
Stefan

[1] https://github.com/apache/airflow/blob/1.8.0/setup.py#L108
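One more thing worth checking: Airflow reads airflow.cfg as an INI file, and keys placed in the wrong section are silently ignored in favor of defaults. The paste in your message below shows no section headers, so here is a sketch of how those values would typically be grouped in a 1.8.0-era config (section placement is my assumption to verify against the default airflow.cfg your install generated):

```ini
# Sketch of airflow.cfg section layout -- verify against your generated default.
[core]
airflow_home = ~/airflow
dags_folder = ~/airflow/dags
executor = CeleryExecutor
sql_alchemy_conn = mysql://reco_airflow:[email protected]:3306/airflow

[celery]
broker_url = amqp://guest:[email protected]:5672//
celery_result_backend = db+mysql://reco_airflow:[email protected]:3306/airflow
flower_host = 10.34.110.227
flower_port = 5555
default_queue = queue
```

If broker_url were not picked up from the [celery] section, the workers could fall back to a default broker, which would match the symptom of RabbitMQ sitting idle while DAGs still run.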

On 1/25/19 3:50 PM, Priyanka Singh (Reco 2.0) wrote:
> I am trying to set up Airflow as a cluster. The Celery executor is unable to
> connect to RabbitMQ while executing jobs. Here are the configurations:
> 
> Machine 1: webserver and scheduler
> Machine 2: webserver
> Machine 3: worker
> Machine 4: Rabbitmq
> 
> Airflow version: v1.8.0
> Celery version: 4.1.0
> Flower UI version: 0.9.1
> 
> 
> 
> **airflow.cfg**
> airflow_home = ~/airflow
> dags_folder = ~/airflow/dags
> base_log_folder = ~/airflow/logs
> executor = CeleryExecutor
> sql_alchemy_conn = mysql://reco_airflow:[email protected]:3306/airflow
> sql_alchemy_pool_size = 5
> dag_concurrency = 16
> dags_are_paused_at_creation = False
> plugins_folder = ~/airflow/plugins
> # Secret key to save connection passwords in the db
> api_client = airflow.api.client.local_client
> endpoint_url = http://10.34.110.227:8080
> base_url = http://10.34.110.227:8080
> web_server_host = 0.0.0.0
> web_server_port = 8080
> workers = 4
> broker_url = amqp://guest:[email protected]:5672//
> celery_result_backend = db+mysql://reco_airflow:[email protected]:3306/airflow
> flower_host = 10.34.110.227
> flower_port = 5555
> default_queue = queue
> 
> DAGs are running successfully using the Celery executor, but I can't see any
> connection to RabbitMQ (it is always idle). Also, if I try to use Flower,
> the UI opens but keeps loading and stops responding after ~5 seconds. No
> errors appear in the logs. Am I missing something in the configuration?
> 
> 
> Here is the Stackoverflow link:
> 
> https://stackoverflow.com/questions/54367536/issues-in-running-airflow-as-cluster-with-celery-executors
> 
