GitHub user ClaudioCurzi created a discussion: How do you trigger Hop workflows 
from Airflow? (Timeout/Stop issues with Hop Server)

Hi everyone,
I’m integrating Apache Airflow with Apache Hop 2.15, and I’m facing a 
limitation when executing workflows through Hop Server.

**Issue**
I trigger Hop workflows from Airflow using the **SSHOperator**.
The problem is that **Airflow cannot stop or kill the workflow** on Hop Server 
if a timeout occurs.
- If the workflow hangs or exceeds Airflow’s timeout, it keeps running on Hop 
Server.
- This creates zombie processes and increases memory usage until the machine 
must be rebooted.

Because of this behavior, Hop Server becomes difficult to operate and maintain in 
production.
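
For reference, here is roughly what my current setup looks like; the connection id, 
paths, and run configuration name below are placeholders:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.ssh.operators.ssh import SSHOperator

with DAG(
    dag_id="hop_workflow_via_ssh",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    run_hop_workflow = SSHOperator(
        task_id="run_hop_workflow",
        ssh_conn_id="hop_server_ssh",  # placeholder connection id
        command=(
            "/opt/hop/hop-run.sh "
            "--file=/opt/hop/workflows/my_workflow.hwf "
            "--project=my_project "
            "--runconfig=remote_hop_server"  # run configuration targeting Hop Server
        ),
        # Airflow can time out and kill the SSH session, but the workflow that was
        # handed off to Hop Server keeps running on the remote machine.
        execution_timeout=timedelta(hours=1),
    )
```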

**Questions**

1. **How do you trigger Apache Hop workflows from Airflow?**
   - Do you use **Hop Server** despite the lack of stop/kill support?
   - Or do you trigger workflows using **hop-run.sh (Hop Run) in local mode** 
directly from Airflow?

2. If you use **Hop Run locally** (a sketch of what I have in mind follows this 
list), does Airflow correctly handle:
   - task timeouts,
   - process termination,
   - killing the workflow when needed?

3. Do you have any **best practices** for a robust Airflow → Hop integration, 
especially regarding:
   - stopping workflows,
   - handling long-running jobs,
   - preventing zombie processes?
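
For question 2, this is roughly the local approach I have in mind; the paths and 
run configuration name are placeholders:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="hop_workflow_local_run",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    run_hop_workflow = BashOperator(
        task_id="run_hop_workflow",
        # A "local" run configuration executes the workflow in a process started
        # on the Airflow worker itself, so a timeout/kill from Airflow should
        # reach the hop-run JVM directly.
        bash_command=(
            "/opt/hop/hop-run.sh "
            "--file=/opt/hop/workflows/my_workflow.hwf "
            "--project=my_project "
            "--runconfig=local"
        ),
        execution_timeout=timedelta(hours=1),
    )
```

My assumption is that when the task times out, Airflow terminates the BashOperator's 
process group and the hop-run JVM along with it, but I would like to confirm whether 
that holds in practice.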

Thanks in advance for any insights or real-world experience!

GitHub link: https://github.com/apache/hop/discussions/6046
