Just a two-cent summary:

You have four main avenues to execute a pipeline, regardless of what you use
to orchestrate it.

You can execute a pipeline in the following ways:

   - on a server (read: any computer) by executing a .sh script on the CLI
   - on a Hop Server by sending an API call
   - in a Docker container, locally or in the cloud, which can be as simple
   as either of the following:
      - the base Hop image + a mounted drive for I/O and the repo
      - your own image + a mounted drive for I/O, where your image is the
      Hop base image + a git pull of your repo
   - on a managed serverless service such as Google's Dataflow
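To make the first and third options concrete, here is a rough sketch. All
names are placeholders (project "samples", pipeline "pipelines/etl.hpl", run
configuration "local"), and each command is guarded so the snippet is a no-op
if the corresponding pieces aren't in place on your machine:

```shell
# Placeholder names -- substitute your own project, pipeline and run config.
PROJECT=samples
PIPELINE=pipelines/etl.hpl
RUN_CONFIG=local

# 1. On a server (read: any machine with a Hop install), straight from the
#    CLI: hop-run.sh ships in the Hop installation directory.
if [ -x ./hop-run.sh ]; then
  ./hop-run.sh --project "$PROJECT" --file "$PIPELINE" --runconfig "$RUN_CONFIG"
fi

# 3. The same run inside the stock apache/hop image, with a mounted volume
#    for the project and its I/O. The HOP_* environment variables are how
#    the official image is parameterised; double-check the exact names
#    against the image documentation for your Hop version.
if [ -d ./project ]; then
  docker run --rm \
    -v "$(pwd)/project:/project" \
    -e HOP_PROJECT_FOLDER=/project \
    -e HOP_PROJECT_NAME="$PROJECT" \
    -e HOP_FILE_PATH="/project/$PIPELINE" \
    -e HOP_RUN_CONFIG="$RUN_CONFIG" \
    apache/hop
fi
```

The second option (Hop Server) is just an HTTP call to the server's REST
API instead; see the Hop Server documentation for the exact endpoints, as
they vary by version.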


The last two are particularly interesting when operating in a cloud, as you
pay for what you use rather than keeping a server ON 24x7.

Airflow will particularly help with the last two, although it can work with
any of them.
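For the "your image" variant of the Docker option, a minimal Dockerfile
sketch might look like this. The repository URL and project name are made
up, and the package-manager step assumes a Debian-style base (adjust to apk
or similar if your base image differs):

```dockerfile
# Start from the official base image and bake your repo in at build time.
FROM apache/hop

# The base image may not ship git; install it if needed (Debian/Ubuntu-style
# shown here -- adapt to the base image's package manager).
USER root
RUN apt-get update && apt-get install -y --no-install-recommends git \
    && rm -rf /var/lib/apt/lists/*

# Hypothetical repository holding your Hop project.
RUN git clone https://github.com/your-org/your-hop-project.git /project

# Point the container at the baked-in project; the mounted drive at run
# time then only needs to carry I/O, not the repo.
ENV HOP_PROJECT_FOLDER=/project \
    HOP_PROJECT_NAME=your-hop-project
```

Airflow can then launch this image per run (for example with its
DockerOperator), which is exactly what makes the pay-per-use model above
work.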





On Tue, 31 Dec 2024 at 22:21, <[email protected]> wrote:

> Thank you - I'll take a look at it.
> I already know that I will throw away all this 'Docker' baggage – I hate
> this software.
>
>
> *Sent:* Monday, December 30, 2024 at 4:41 PM
> *From:* "Hans Van Akelyen" <[email protected]>
> *To:* [email protected]
> *Subject:* Re: Do you have any experience running Hop pipelines in Apache
> Airflow?
> There is a how-to guide on the website [1]
>
>
> Cheers,
> Hans
>
> [1]
>
> https://hop.apache.org//manual/latest/how-to-guides/run-hop-in-apache-airflow.html#_what_is_apache_airflow
>
> On Mon, 30 Dec 2024 at 16:20, Ganesh T <[email protected]> wrote:
>
>> Hi,
>>
>> Yeah. We are taking care of ETLs running hop workflows.
>>
>> Regards
>> Ganesh
>>
>> On Mon, 30 Dec 2024 at 8:16 PM, <[email protected]> wrote:
>>
>>>
>>> Hi,
>>>
>>> Do you have any experience running Hop pipelines in Apache Airflow?
>>> Specifically, I mean managing logging, reporting execution status, etc.
>>>
>>> Best,
>>>
>>> P
>>>
>>
>
>
