Thanks for the explanation, really helpful.
Cheers,
Ali
On 2018/05/16 03:27:27, Ruiqin Yang wrote:
You are right, but that's within the same process. The way each operator gets executed is that one `airflow run` command gets generated and sent to the local executor; the executor spins up subprocesses to run `airflow run --raw` (which parses the file again and calls operator.execute()). Thus each task runs in its own process and re-imports your modules.
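A minimal sketch of what that implies (hypothetical db.py contents, not the poster's actual file): any module-level side effect, such as opening a connection, runs once per `airflow run --raw` subprocess, i.e. once per task.

```python
# dags/core/util/db.py -- hypothetical sketch.
# Module-level code executes once per importing process. Since each task
# is run by its own `airflow run --raw` subprocess, this block runs again
# for every task, even within a single DAG run.
import os

print("db.py imported in PID %s" % os.getpid())  # once per task process
connection = {"pid": os.getpid()}  # stand-in for a real DB connection
```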
Thanks Kevin. Yes, I'm importing db in different operators. That said, my understanding is that if a module has already been imported, it's not loaded again even if you try to import it again (and I reckon this is why the Singleton pattern is not commonly used in Python). Is that right?
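That understanding can be checked directly: Python caches imported modules in `sys.modules`, so a repeated import is just a dictionary lookup and the module body does not run again.

```python
import sys

import json                          # first import: module body executes, result cached
first = sys.modules["json"]

import json                          # repeated import: plain sys.modules lookup
assert sys.modules["json"] is first  # same module object, body did not rerun

# Only an explicit reload forces the module body to execute again:
import importlib
importlib.reload(json)
```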
On 2018/05/16 02:34:18, Kevin wrote:
From: alireza.khoshkbari@
Sent: May 16, 2018 1:21
To: d...@airflow.apache.org
Subject: How Airflow import modules as it executes the tasks
Not exactly answering your question, but the reason db.py is loaded in each task might be that you have something like `import db` in each of your *.py files, and Airflow spins up one process to parse each *.py file, thus your db.py is loaded multiple times.
I'm not sure how you can share the connection across processes.
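One common workaround (a sketch, not from this thread): since a connection can't be shared across processes, lazily cache one per process, which under Airflow's one-process-per-task model means one per task.

```python
# dags/core/util/db.py -- hypothetical lazy-initialization sketch.

def create_connection():
    # Stand-in for a real factory such as psycopg2.connect(...).
    return object()

_connection = None

def get_connection():
    """Return this process's connection, creating it on first use."""
    global _connection
    if _connection is None:
        _connection = create_connection()
    return _connection
```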
To start off, here is my project structure:
├── dags
│   ├── __init__.py
│   ├── core
│   │   ├── __init__.py
│   │   ├── operators
│   │   │   ├── __init__.py
│   │   │   └── first_operator.py
│   │   └── util
│   │       ├── __init__.py
│   │       └── db.py
│   └── my_dag.py
Here is the version
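For reference, with this layout and the dags folder on sys.path (which Airflow sets up when parsing DAG files), the DAG file would import the operator roughly like this; FirstOperator is a hypothetical class name inferred from first_operator.py:

```python
# dags/my_dag.py -- minimal sketch based on the tree above.
from datetime import datetime

from airflow import DAG
from core.operators.first_operator import FirstOperator  # hypothetical class

dag = DAG("my_dag", start_date=datetime(2018, 5, 1), schedule_interval=None)

task = FirstOperator(task_id="first_task", dag=dag)
```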