GitHub user marcopistacchio added a comment to the discussion: Hop Web | 
Airflow HTTP operator

Thank you, @bamaer 

> The current Hop Server is stateless. It can't read pipelines from disk, so 
> you'll have to register all pipelines and workflow before you can execute 
> them there, and you'll have to repeat the process every time Hop Server is 
> restarted. You wouldn't have that issue with the DockerOperator

Understood, thank you! Without hot reloading in Hop Server, this becomes 
more complex: we would need to manage container restarts to make newly 
created or changed pipelines available to Hop Server.
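To make the re-registration step concrete, here is a minimal sketch of pushing a pipeline back to a restarted Hop Server over HTTP. It assumes the `registerPipeline` web service accepts the pipeline configuration XML in the request body and that the default `cluster`/`cluster` credentials are in use; the host, port, and payload format would need to be verified against your Hop Server setup.

```python
# Hypothetical sketch: re-register a pipeline after a Hop Server restart.
# Endpoint path, credentials, and payload format are assumptions to verify.
from urllib.parse import urljoin


def register_url(base_url: str) -> str:
    """Build the assumed registerPipeline endpoint URL for a Hop Server."""
    return urljoin(base_url, "/hop/registerPipeline/?xml=Y")


def register_pipeline(base_url: str, pipeline_xml: str,
                      auth=("cluster", "cluster")) -> str:
    """POST the pipeline configuration XML to the server (sketch only)."""
    import requests  # third-party; assumed available where this runs
    resp = requests.post(
        register_url(base_url),
        data=pipeline_xml,
        auth=auth,
        headers={"Content-Type": "text/xml"},
    )
    resp.raise_for_status()
    return resp.text
```

A script like this could loop over every `.hpl` file on the shared volume and run on container start, but as noted, that is exactly the bookkeeping the DockerOperator approach avoids.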

> Another option could be to build your workflows and pipelines in Hop Web and 
> integrate them in a gitops process that builds fully self-contained container 
> images that you can execute from Airflow. This would eliminate the complexity 
> of mounting volumes to share your Hop projects, but it would introduce new 
> complexity on the gitops side.

Yes, we could build a custom Docker image that contains the Hop project files 
and rebuild it whenever a workflow is changed or added, but, as you mentioned, 
that would require GitOps work.

I am leaning toward a persistent disk volume with the DockerOperator and Hop Run.
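For reference, a rough sketch of what that approach could look like: the persistent disk holding the Hop project is bind-mounted into an `apache/hop` container and `hop-run` is invoked against a pipeline file. The project name, paths, run configuration name, and the `hop-run.sh` location inside the image are placeholders to adapt.

```python
# Sketch of the DockerOperator + Hop Run approach. All names and paths
# below are hypothetical placeholders, not values from the discussion.

def hop_run_command(project: str, file: str,
                    runconfig: str = "local") -> list[str]:
    """Build the hop-run invocation executed inside the container."""
    return [
        "/opt/hop/hop-run.sh",   # assumed install path in the image
        "--project", project,
        "--file", file,
        "--runconfig", runconfig,
    ]

# In the Airflow DAG this would become roughly (untested sketch):
#
# from airflow.providers.docker.operators.docker import DockerOperator
# from docker.types import Mount
#
# run_pipeline = DockerOperator(
#     task_id="run_pipeline",
#     image="apache/hop:latest",
#     command=hop_run_command("my_project",
#                             "/project/pipelines/load_sales.hpl"),
#     mounts=[Mount(source="/mnt/persistent-disk/hop-project",
#                   target="/project", type="bind")],
# )
```

Since the container reads the project straight from the mounted volume on every run, there is no registration step and no restart management.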

Thanks again for the feedback 👍 


GitHub link: 
https://github.com/apache/hop/discussions/6095#discussioncomment-15116251
