mavenzer opened a new issue, #28599:
URL: https://github.com/apache/airflow/issues/28599

   ### Apache Airflow version
   
   Other Airflow 2 version (please specify below)
   
   ### What happened
   
   We have deployed a custom Airflow 2.2.4 Docker image on OpenShift v4. The image is based on the official Airflow base image, with a few extra Python packages added for our tailored use cases.
   
   The components in our Airflow architecture:
   - PostgreSQL as the metadata DB
   - Redis as the message broker
   - Webserver
   
   Python Version 3.9
   
   
   
   While executing the command `airflow celery flower`, I get the following error:
   ```
   (app-root) (app-root) sh-4.4$ airflow celery flower
   [2022-12-26 13:35:37,902] {command.py:152} INFO - Visit me at 
http://0.0.0.0:5555
   [2022-12-26 13:35:37,906] {command.py:159} INFO - Broker: 
redis://:**@10.121.129.57:6379/0
   [2022-12-26 13:35:37,908] {command.py:160} INFO - Registered tasks: 
   ['airflow.executors.celery_executor.execute_command',
    'celery.accumulate',
    'celery.backend_cleanup',
    'celery.chain',
    'celery.chord',
    'celery.chord_unlock',
    'celery.chunks',
    'celery.group',
    'celery.map',
    'celery.starmap']
   Traceback (most recent call last):
     File "/opt/app-root/bin/airflow", line 8, in <module>
       sys.exit(main())
     File "/opt/app-root/lib64/python3.9/site-packages/airflow/__main__.py", 
line 48, in main
       args.func(args)
     File 
"/opt/app-root/lib64/python3.9/site-packages/airflow/cli/cli_parser.py", line 
48, in command
       return func(*args, **kwargs)
     File "/opt/app-root/lib64/python3.9/site-packages/airflow/utils/cli.py", 
line 92, in wrapper
       return f(*args, **kwargs)
     File 
"/opt/app-root/lib64/python3.9/site-packages/airflow/cli/commands/celery_command.py",
 line 79, in flower
       celery_app.start(options)
     File "/opt/app-root/lib64/python3.9/site-packages/celery/app/base.py", 
line 371, in start
       celery.main(args=argv, standalone_mode=False)
     File "/opt/app-root/lib64/python3.9/site-packages/click/core.py", line 
1053, in main
       rv = self.invoke(ctx)
     File "/opt/app-root/lib64/python3.9/site-packages/click/core.py", line 
1659, in invoke
       return _process_result(sub_ctx.command.invoke(sub_ctx))
     File "/opt/app-root/lib64/python3.9/site-packages/click/core.py", line 
1395, in invoke
       return ctx.invoke(self.callback, **ctx.params)
     File "/opt/app-root/lib64/python3.9/site-packages/click/core.py", line 
754, in invoke
       return __callback(*args, **kwargs)
     File "/opt/app-root/lib64/python3.9/site-packages/click/decorators.py", 
line 26, in new_func
       return f(get_current_context(), *args, **kwargs)
     File "/opt/app-root/lib64/python3.9/site-packages/flower/command.py", line 
53, in flower
       flower.start()
     File "/opt/app-root/lib64/python3.9/site-packages/flower/app.py", line 77, 
in start
       self.listen(self.options.port, address=self.options.address,
     File "/opt/app-root/lib64/python3.9/site-packages/tornado/web.py", line 
2109, in listen
       server.listen(port, address)
     File "/opt/app-root/lib64/python3.9/site-packages/tornado/tcpserver.py", 
line 151, in listen
       sockets = bind_sockets(port, address=address)
     File "/opt/app-root/lib64/python3.9/site-packages/tornado/netutil.py", 
line 161, in bind_sockets
       sock.bind(sockaddr)
   OSError: [Errno 98] Address already in use
   [2022-12-26 13:35:37,912] {mixins.py:225} INFO - Connected to 
redis://:**@10.121.129.57:6379/0
   ```
   
   My understanding is that the address Flower tries to bind to is already in use. I have tried killing the Redis and Flower deployments and redeploying them, but the same issue persists.
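   One quick way to confirm that something inside the pod really is still listening on port 5555 (as opposed to a Redis connectivity problem) is a socket probe. This is a minimal sketch; the `port_in_use` helper is my own, not part of Airflow:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if a TCP connection to host:port succeeds,
    i.e. some process is already listening on that port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        return s.connect_ex((host, port)) == 0
```

   Running this inside the Flower container (e.g. via `python3 -c ...`) before starting `airflow celery flower` would distinguish a stale listener in the pod from a problem with Flower itself.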
   
   I have also tried killing the PID of the Flower process with `kill $PID`.
   
   The route for Flower is also inaccessible; I cannot log into it.
   
   The configuration in airflow.cfg is as follows:
   ```
   flower_host = 0.0.0.0
   
   # The root URL for Flower
   # Example: flower_url_prefix = /flower
   flower_url_prefix = /flower
   
   # This defines the port that Celery Flower runs on
   flower_port = 5555
   
   # Securing Flower with Basic Authentication
   # Accepts user:password pairs separated by a comma
   # Example: flower_basic_auth = user1:password1,user2:password2
   flower_basic_auth = username:password
   ```
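   If port 5555 genuinely is occupied inside the pod, `flower_port` here (or the `AIRFLOW__CELERY__FLOWER_PORT` environment variable, following Airflow's `AIRFLOW__SECTION__KEY` override convention) can point Flower at another port. As a sketch for letting the OS suggest an unused one (`find_free_port` is a hypothetical helper, not an Airflow API):

```python
import socket

def find_free_port() -> int:
    """Bind to port 0 so the OS assigns an unused TCP port,
    then release it so another process can bind it right away."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("", 0))
        return s.getsockname()[1]
```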
   
   ### What you think should happen instead
   
   _No response_
   
   ### How to reproduce
   
   _No response_
   
   ### Operating System
   
   rhel-linux
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Other
   
   ### Deployment details
   
   The Deployment Config of Flower is as follows:
   
   ```
   apiVersion: apps.openshift.io/v1
   kind: DeploymentConfig
   metadata:
     name: airflow-flower
     namespace: CUSTOM_NAMESPACE
     labels:
       app: airflow
   spec:
     strategy:
       type: Rolling
     triggers:
       - type: ConfigChange
       - type: ImageChange
         imageChangeParams:
           automatic: true
           containerNames:
             - airflow-flower
           from:
             kind: ImageStreamTag
             namespace: CUSTOM_NAMESPACE
             name: "airflow-test:latest"
     replicas: 1
     revisionHistoryLimit: 10
     paused: false
     selector:
       app: airflow
       deploymentconfig: airflow-flower
     template:
       metadata:
         labels:
           name: airflow-flower
           app: airflow
           deploymentconfig: airflow-flower
       spec:
         volumes:
           - name: airflow-dags
             persistentVolumeClaim:
               claimName: airflow-dags
           - name: airflow-logs
             persistentVolumeClaim:
               claimName: airflow-logs
             
         containers:
           - name: airflow-flower
             image: airflow-test
             resources:
               limits:
                 memory: 512Mi
             env:
                - name: AIRFLOW__CORE__SQL_ALCHEMY_CONN
                  value: postgresql+psycopg2://airflow:PASSWORD@airflow-db/airflow
                - name: AIRFLOW__CELERY__RESULT_BACKEND
                  value: db+postgresql://airflow:PASSWORD@airflow-db/airflow
                - name: AIRFLOW__CORE__EXECUTOR
                  value: CeleryExecutor
                - name: AIRFLOW__CELERY__BROKER_URL
                  value: redis://:PASSWORD@HOSTNAME:6379/0
                - name: FLOWER_BASIC_AUTH
                  value: username:password
               - name: REDIS_PASSWORD
                 valueFrom:
                   secretKeyRef:
                     name: airbus-redis
                     key: database-password 
               - name: POSTGRESQL_USER
                 valueFrom:
                   secretKeyRef:
                     name: airflow-db
                     key: database-user
               - name: POSTGRESQL_PASSWORD
                 valueFrom:
                   secretKeyRef:
                     name: airflow-db
                     key: database-password
               - name: POSTGRESQL_ROOT_PASSWORD
                 valueFrom:
                   secretKeyRef:
                     name: airflow-db
                     key: database-root-password
               - name: POSTGRESQL_DATABASE 
                 valueFrom:
                   secretKeyRef:
                     name: airflow-db
                     key: database-name
                   
             ports:
               - containerPort: 5555
                 protocol: TCP
   ```
   
   The Deployment Config of Redis is as follows:
   ```
   apiVersion: apps.openshift.io/v1
   kind: DeploymentConfig
   metadata:
     name: airflow-redis
     namespace: CUSTOM_NAMESPACE
      labels:
       app: airflow
   spec:
     strategy:
       type: Rolling
     triggers:
       - type: ConfigChange
     replicas: 1
     revisionHistoryLimit: 10
     paused: false 
     selector:
       app: airflow
       deploymentconfig: airflow-redis
     template:
       metadata:
          labels:
            name: airflow-redis
            app: airflow
            deploymentconfig: airflow-redis
        spec:
          volumes:
            - name: airflow-redis
              persistentVolumeClaim:
                claimName: airflow-redis
          containers:
            - name: airflow-redis
             image: redis-6-rhel8
             resources:
               limits:
                 memory: 512Mi
             env:
               - name: REDIS_PASSWORD
                 valueFrom:
                   secretKeyRef:
                     name: airbus-redis
                     key: database-password
             ports:
               - containerPort: 6379
                 protocol: TCP
             volumeMounts:
               - name: airflow-redis
                  mountPath: /var/lib/redis/data
    ```
   
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

