Hi,
I'm wondering why the conf parameter is not available in the DagRun UI. I
believe the conf param can be used to pass values dynamically to a DagRun,
and IMO it makes sense to have that option in the DagRun UI.
WDYT?
Best
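For readers who haven't used conf: values passed at trigger time end up on the DagRun and can be read inside a task callable. Below is a pure-Python sketch of that flow with no Airflow imports; `build_context` and the context layout merely mimic the idea of Airflow's template context and are assumptions for illustration, not its real API.

```python
# Sketch of how a conf dict supplied at trigger time (e.g. the JSON you
# would type into a DagRun UI field) could reach a task callable.
# The context layout here is illustrative, not Airflow's actual context.
import json

def build_context(conf_json: str) -> dict:
    """Parse the user-supplied conf JSON, standing in for the scheduler."""
    return {"dag_run": {"conf": json.loads(conf_json)}}

def my_task(**context):
    # Inside a task callable, conf is read off the dag_run entry.
    conf = context["dag_run"]["conf"]
    return conf.get("message", "no message")

ctx = build_context('{"message": "hello from the UI"}')
print(my_task(**ctx))  # hello from the UI
```

The point of exposing conf in the UI would be letting users fill in that JSON by hand instead of going through the CLI or API.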
Hi,
I am checking the DagRunOperator [1], and I was using
example_trigger_controller_dag.py & example_trigger_target_dag.py.
When I triggered the controller dag, the target dag's tasks were not
triggered.
Check dags.png.
While investigating further (check dagruns.png), I believe this is due to dag
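For context, the two example dags above demonstrate a controller/target handoff: the controller triggers a run of the target dag and passes it a payload. Here is a pure-Python sketch of that pattern with no Airflow imports; the function names and the list standing in for the DagRun table are illustrative assumptions, not Airflow's implementation.

```python
# Sketch of the controller/target pattern shown by
# example_trigger_controller_dag.py & example_trigger_target_dag.py.
# In Airflow, triggering creates a DagRun row the scheduler picks up;
# here a plain list stands in for that queue.
pending_runs = []  # stands in for the DagRun table

def trigger_dag(dag_id, payload):
    """Controller side: enqueue a run of the target dag with a conf payload."""
    pending_runs.append({"dag_id": dag_id, "conf": payload})

def run_target(run):
    """Target side: a task reads the conf the controller passed along."""
    return "received: " + run["conf"]["message"]

trigger_dag("example_trigger_target_dag", {"message": "Hello World"})
results = [run_target(r) for r in pending_runs]
```

If the target's tasks never start (as described above), the handoff breaks between enqueueing the run and the scheduler picking it up, which is what the dagruns.png screenshot was meant to show.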
I think it's going to be an antipattern to write Python configuration in
Airflow to configure a Kubernetes deployment, since even a "simple"
deployment would likely be more classes/objects than the DAG itself. I do
like the idea of having a more fully featured operator than the PodOperator,
but if I were
Airflow logs are stored on the worker filesystem. When a worker starts, it
runs a subprocess that serves logs via Flask:
https://github.com/apache/incubator-airflow/blob/master/airflow/bin/cli.py#L985
If you use the remote logging feature, the logs are (instead? also?) stored
in S3.
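The essence of that log-serving subprocess is mapping a task's identifiers to a file under the worker's log directory and returning its contents. Below is a stdlib-only sketch of that idea; the path layout, filename, and `read_task_log` helper are assumptions for illustration, not the actual layout or API in cli.py.

```python
# Stdlib-only sketch of what the worker's log-serving endpoint does:
# resolve (dag_id, task_id, execution_date) to a file under a base log
# directory and return its text. The directory layout and "1.log"
# filename are illustrative assumptions.
import os

def read_task_log(base_dir, dag_id, task_id, execution_date):
    """Return the log text for one task instance, refusing any path
    that resolves outside base_dir (guards against path traversal)."""
    path = os.path.join(base_dir, dag_id, task_id, execution_date, "1.log")
    real = os.path.realpath(path)
    if not real.startswith(os.path.realpath(base_dir) + os.sep):
        raise ValueError("log path escapes base directory")
    with open(real) as f:
        return f.read()
```

In the Flask subprocess the webserver fetches logs over HTTP from each worker this way; with remote logging the webserver can read them from S3 instead of asking the worker at all.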
Postgres