Don't you need to preserve the task objects? Your implementation overwrites
each one with its successor, so only the last task is kept, despite what your
print statements show. Try building a list or dict of tasks like:
tasks = []  # only at the top
for file in glob('dags/snowsql/create/udf/*.sql'):
    print("FIL
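A minimal sketch of the list-building approach being suggested, assuming the files come from `glob` as above (the operator construction is only hinted at here, since the original snippet is truncated):

```python
from glob import glob

tasks = []  # create the list once, before the loop
for path in sorted(glob('dags/snowsql/create/udf/*.sql')):
    print("Found SQL file:", path)
    # In the real DAG this would append an operator built from `path`;
    # appending keeps every task instead of overwriting a single variable.
    tasks.append(path)
```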
Ok, my mistake. I thought that command was querying the server for its
information, not just looking in a directory relative to where it is being
run. I have it working now. Thanks, Chris and Sai!
On 9/14/18, 9:58 AM, "Chris Palmer" wrote:
The relative paths might work from wherever you are invoking 'airflow
list_tasks', but that doesn't mean they work from wherever the webserver is
parsing the DAGs.
Does running 'airflow list_tasks' from some other working directory work?
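One common way to make the paths independent of the working directory, sketched here under the assumption that the glob lives in the DAG file itself (`DAG_DIR` is an illustrative name):

```python
import os
from glob import glob

# Resolve the SQL directory relative to this DAG file rather than the
# process's current working directory, so the CLI, scheduler, and
# webserver all find the same files.
DAG_DIR = os.path.dirname(os.path.abspath(__file__))
sql_files = glob(os.path.join(DAG_DIR, 'snowsql/create/udf/*.sql'))
```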
On Fri, Sep 14, 2018 at 12:35 PM Frank Maritato wrote:
Do you mean give the full path to the files? The relative path I'm using
definitely works. When I type airflow list_dags, I can see from the output of
the print statements that the glob is finding my sql files and creating the
snowflake operators.
airflow list_tasks workflow also lists all the operators.