[ https://issues.apache.org/jira/browse/AIRFLOW-1539?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16147058#comment-16147058 ]

Matthias Huschle commented on AIRFLOW-1539:
-------------------------------------------

How long does this take? The timeout fires during "{{if ismodule(module) and 
hasattr(module, '__file__')}}", which doesn't look suspicious to me. The 
default timeout for DAG loading (DAGBAG_IMPORT_TIMEOUT) is 30 seconds; are you 
sure you're not simply hitting that limit? I have no idea why python2 would 
avoid it, though. Is the error exactly the same each time, even when you vary 
your configuration for DAGBAG_IMPORT_TIMEOUT?
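
For reference, the {{handle_timeout}} frame at the bottom of your stack comes from a SIGALRM-based timeout context manager. A minimal sketch of that pattern (an approximation for illustration, not Airflow's actual implementation) looks like:

{code}
import signal


class ImportTimeout:
    """Sketch of a SIGALRM-based timeout, similar in spirit to
    airflow.utils.timeout. Unix-only, main thread only."""

    def __init__(self, seconds, error_message="Timeout"):
        self.seconds = seconds
        self.error_message = error_message

    def handle_timeout(self, signum, frame):
        # Called by the kernel's alarm signal; aborts whatever
        # the protected block was doing at that moment.
        raise TimeoutError(self.error_message)

    def __enter__(self):
        signal.signal(signal.SIGALRM, self.handle_timeout)
        signal.alarm(self.seconds)
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        signal.alarm(0)  # cancel any pending alarm
{code}

The point is that the alarm interrupts whatever line happens to be executing when the budget runs out, so the frame named in the traceback ({{inspect.getmodule}}) is usually incidental, not the culprit. If the DAG file legitimately needs longer, the limit can be raised via {{dagbag_import_timeout}} in the {{[core]}} section of airflow.cfg.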

> python3 error import Dag
> ------------------------
>
>                 Key: AIRFLOW-1539
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1539
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: DAG
>    Affects Versions: Airflow 1.8
>         Environment: docker debian linux 9, conda python3
>            Reporter: Lulu Cheng
>            Priority: Blocker
>
> Seeing the following error when using python3 (fine with python2)
> {code}
> Traceback (most recent call last):
>   File "/opt/conda/lib/python3.6/site-packages/airflow/models.py", line 263, in process_file
>     m = imp.load_source(mod_name, filepath)
>   File "/opt/conda/lib/python3.6/imp.py", line 172, in load_source
>     module = _load(spec)
>   File "<frozen importlib._bootstrap>", line 675, in _load
>   File "<frozen importlib._bootstrap>", line 655, in _load_unlocked
>   File "<frozen importlib._bootstrap_external>", line 678, in exec_module
>   File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
>   File "/root/airflow/dags/LoadS3ToHiveTest.py", line 2, in <module>
>     from LoadS3ToHive import default_args, download_s3_to_hdfs_templated_command, generate_download_s3_to_hdfs_task, \
>   File "/root/airflow/dags/LoadS3ToHive.py", line 74, in <module>
>     globals()[task_name] = DAG(task_name, default_args=default_args, schedule_interval=None)
>   File "/opt/conda/lib/python3.6/site-packages/airflow/models.py", line 2664, in __init__
>     self.fileloc = inspect.getsourcefile(inspect.stack()[1][0])
>   File "/opt/conda/lib/python3.6/inspect.py", line 1465, in stack
>     return getouterframes(sys._getframe(1), context)
>   File "/opt/conda/lib/python3.6/inspect.py", line 1442, in getouterframes
>     frameinfo = (frame,) + getframeinfo(frame, context)
>   File "/opt/conda/lib/python3.6/inspect.py", line 1411, in getframeinfo
>     filename = getsourcefile(frame) or getfile(frame)
>   File "/opt/conda/lib/python3.6/inspect.py", line 666, in getsourcefile
>     if getattr(getmodule(object, filename), '__loader__', None) is not None:
>   File "/opt/conda/lib/python3.6/inspect.py", line 703, in getmodule
>     if ismodule(module) and hasattr(module, '__file__'):
>   File "/opt/conda/lib/python3.6/site-packages/airflow/utils/timeout.py", line 38, in handle_timeout
>     raise AirflowTaskTimeout(self.error_message)
> airflow.exceptions.AirflowTaskTimeout: Timeout
> {code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)