> Absolutely. We wrote a custom AWS S3 XCom backend to do exactly that.

Well if you have it all working then what are we yabbering about :)

I think a custom XCom backend requires that your container is running
Airflow -- but you are using KPO (KubernetesPodOperator), so I don't know
if that's true. Either that, or it forces you to pull the data in via
templating -- but if it's large data, that's back to your original problem.
Did your custom backend work for you? If so, what's the problem you're
facing? If it works, then great, and no reason to second-guess it, I'd
think.
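For anyone following along, the custom-XCom-backend idea boils down to:
serialize the real payload out to object storage and keep only a small
reference string in the Airflow metadata DB. Here's a minimal runnable
sketch of that round trip -- FakeS3, serialize_value, and deserialize_value
are illustrative stand-ins (in a real backend you'd subclass
airflow.models.xcom.BaseXCom and use boto3), not Airflow's actual API:

```python
# Sketch of the S3 XCom backend pattern: the payload goes to object
# storage, and only a pointer string is handed back to Airflow's XCom
# table. FakeS3 is a dict standing in for an S3 bucket.
import json
import uuid

FakeS3 = {}  # stand-in for an S3 bucket: key -> bytes

PREFIX = "s3-xcom://"

def serialize_value(value):
    """Upload the real payload; return only a small reference string."""
    key = f"xcom/{uuid.uuid4()}.json"
    FakeS3[key] = json.dumps(value).encode()
    return PREFIX + key

def deserialize_value(ref):
    """Resolve the reference string back into the payload."""
    assert ref.startswith(PREFIX)
    key = ref[len(PREFIX):]
    return json.loads(FakeS3[key].decode())

ref = serialize_value({"rows": list(range(5))})
print(ref)                    # only this short string hits the XCom table
print(deserialize_value(ref)) # downstream task pulls the full payload
```

The point is that the task pushing XCom (or the backend wrapping it) has to
run this serialize step somewhere -- which is exactly why it's unclear how
it plays with KPO, where your container may not have Airflow in it at all.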

You certainly don't _need_ to use custom XCom. And I'm not sure you really
need KPO either -- maybe you need an ancient client to work with your
ancient ES cluster, and a modern client for most of your ES jobs?

But anyway if I were doing this, my first thought would be not to bother
with xcom (though I "grew up" in a pre-taskflow API world).

my_data_path = "s3://bucket/key.txt"
op1 = GetDataToS3(task_id='to_s3', path=my_data_path)
op2 = LoadToES(task_id='to_es', path=my_data_path)
op1 >> op2
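To make the handoff concrete: because both operators are constructed with
the same path, nothing needs to flow between them at runtime -- the
dependency arrow only enforces ordering. A minimal runnable sketch of that
idea, with put_data/load_data and a FakeS3 dict as stand-ins for the
hypothetical GetDataToS3/LoadToES operators above:

```python
# "Agree on a path up front, skip XCom" -- both sides are parameterized
# with the same S3 URI, so no data passes through Airflow itself.
FakeS3 = {}  # stand-in for the S3 bucket

MY_DATA_PATH = "s3://bucket/key.txt"

def put_data(path):
    # op1: fetch the data and park it at the agreed location
    FakeS3[path] = b"payload from upstream system"

def load_data(path):
    # op2: read from the agreed location and index it into ES
    return FakeS3[path]

put_data(MY_DATA_PATH)        # op1 runs first (op1 >> op2)
print(load_data(MY_DATA_PATH))
```

The trade-off: you lose XCom's bookkeeping, so you'd want the path to be
unique per run (e.g. templated with the execution date) to avoid collisions
between DAG runs.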
