GitHub user Traviscal added a comment to the discussion: TypeError when 
importing operators from airflow.providers.google.cloud.operators.dataproc

> ### Apache Airflow Provider(s)
> 
> google
> 
> ### Versions of Apache Airflow Providers
> 
> 12.0.0
> 
> ### Apache Airflow version
> 
> 2.10.4
> 
> ### Operating System
> 
> Ubuntu 24.04.1 LTS
> 
> ### Deployment
> 
> Virtualenv installation
> 
> ### Deployment details
> 
> _No response_
> 
> ### What happened
> 
> When importing any of the existing operators from 
> `airflow.providers.google.cloud.operators.dataproc` (for example 
> `DataprocCreateClusterOperator`, `DataprocDeleteClusterOperator`, or 
> `DataprocSubmitJobOperator`) at the top of a DAG file, a DAG import error 
> occurs when the Airflow scheduler starts.
> 
> ### What you think should happen instead
> 
> The import should succeed without errors. Instead, when I start the Airflow 
> scheduler the following error occurs:
> 
> ```
> [2025-02-02T19:12:46.029+0100] {logging_mixin.py:190} INFO - [2025-02-02T19:12:46.028+0100] {dagbag.py:387} ERROR - Failed to import: /home/iscipar/Projects/airflow-tutorial/dags/6_dataproc_airflow.py
> Traceback (most recent call last):
>   File "/home/iscipar/Proyectos/airflow-tutorial/airflow_env/lib/python3.12/site-packages/airflow/models/dagbag.py", line 383, in parse
>     loader.exec_module(new_module)
>   File "<frozen importlib._bootstrap_external>", line 995, in exec_module
>   File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
>   File "/home/iscipar/Proyectos/airflow-tutorial/dags/6_dataproc_airflow.py", line 5, in <module>
>     from airflow.providers.google.cloud.operators.dataproc import (
>   File "/home/iscipar/Proyectos/airflow-tutorial/airflow_env/lib/python3.12/site-packages/airflow/providers/google/cloud/operators/dataproc.py", line 57, in <module>
>     from airflow.providers.google.cloud.openlineage.utils import (
>   File "/home/iscipar/Proyectos/airflow-tutorial/airflow_env/lib/python3.12/site-packages/airflow/providers/google/cloud/openlineage/utils.py", line 204, in <module>
>     class BigQueryJobRunFacet(RunFacet):
> TypeError: function() argument 'code' must be code, not str
> ```
> 
> ### How to reproduce
> 
> The error can be reproduced simply by adding the following import to a DAG 
> file, even without importing any of the source code for that DAG's tasks:
> 
> ```python
> from airflow.providers.google.cloud.operators.dataproc import (
>     DataprocCreateClusterOperator,
>     DataprocDeleteClusterOperator,
>     DataprocSubmitJobOperator,
> )
> ```
> 
> ### Anything else
> 
> _No response_
> 
> ### Are you willing to submit PR?
> 
> - [ ] Yes I am willing to submit a PR!
> 
> ### Code of Conduct
> 
> - [x] I agree to follow this project's [Code of 
> Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
> 

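Editor's note: the quoted traceback fails while `attrs` is building an OpenLineage facet class (`class BigQueryJobRunFacet(RunFacet):`), so a version mismatch between `attrs` / `openlineage-python` and Python 3.12 is one plausible culprit. That diagnosis is an assumption, not something confirmed in the report. A minimal sketch to collect the versions worth including in the discussion (the package names checked are likewise assumptions about what is installed):

```python
# Sketch (assumption, not from the report): gather the versions most likely
# involved in the "function() argument 'code' must be code, not str" TypeError.
import sys
from importlib import metadata


def installed_version(package: str) -> str:
    """Return the installed version of *package*, or 'not installed'."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return "not installed"


if __name__ == "__main__":
    print("python:", sys.version.split()[0])
    # Packages assumed relevant given the failing attrs/OpenLineage frame.
    for pkg in ("attrs", "openlineage-python", "apache-airflow-providers-google"):
        print(f"{pkg}:", installed_version(pkg))
```

Posting this output alongside the traceback would let maintainers check whether the installed `attrs` predates Python 3.12 support.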
Dm @Traviscal 

GitHub link: 
https://github.com/apache/airflow/discussions/46478#discussioncomment-12130751
