iscipar opened a new issue, #46357:
URL: https://github.com/apache/airflow/issues/46357

   ### Apache Airflow Provider(s)
   
   google
   
   ### Versions of Apache Airflow Providers
   
   12.0.0
   
   ### Apache Airflow version
   
   2.10.4
   
   ### Operating System
   
   Ubuntu 24.04.1 LTS
   
   ### Deployment
   
   Virtualenv installation
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   When importing any of the operators in `airflow.providers.google.cloud.operators.dataproc` (for example `DataprocCreateClusterOperator`, `DataprocDeleteClusterOperator`, or `DataprocSubmitJobOperator`) at the top of a DAG file, a DAG import error occurs when the Airflow scheduler starts.
   
   ### What you think should happen instead
   
   The operators should import cleanly and the DAG should parse without errors. Instead, when I start the Airflow scheduler the following error occurs:
   
   ```
   [2025-02-02T19:12:46.029+0100] {logging_mixin.py:190} INFO - [2025-02-02T19:12:46.028+0100] {dagbag.py:387} ERROR - Failed to import: /home/iscipar/Projects/airflow-tutorial/dags/6_dataproc_airflow.py
   Traceback (most recent call last):
     File "/home/iscipar/Proyectos/airflow-tutorial/airflow_env/lib/python3.12/site-packages/airflow/models/dagbag.py", line 383, in parse
       loader.exec_module(new_module)
     File "<frozen importlib._bootstrap_external>", line 995, in exec_module
     File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
     File "/home/iscipar/Proyectos/airflow-tutorial/dags/6_dataproc_airflow.py", line 5, in <module>
       from airflow.providers.google.cloud.operators.dataproc import (
     File "/home/iscipar/Proyectos/airflow-tutorial/airflow_env/lib/python3.12/site-packages/airflow/providers/google/cloud/operators/dataproc.py", line 57, in <module>
       from airflow.providers.google.cloud.openlineage.utils import (
     File "/home/iscipar/Proyectos/airflow-tutorial/airflow_env/lib/python3.12/site-packages/airflow/providers/google/cloud/openlineage/utils.py", line 204, in <module>
       class BigQueryJobRunFacet(RunFacet):
   TypeError: function() argument 'code' must be code, not str
   ```
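For context, the final TypeError is the message CPython's `types.FunctionType` constructor raises when given a string instead of a code object; `attrs` (which the OpenLineage facet classes are built on) regenerates methods this way when it creates slotted classes, which hints at a version mismatch between the installed `attrs`/`openlineage-python` packages and Python 3.12. A minimal sketch of the same error, independent of Airflow:

```python
import types


def sample():
    return 42


# FunctionType expects an actual code object as its first argument.
fn = types.FunctionType(sample.__code__, globals())
assert fn() == 42

# Passing a str instead raises the same TypeError seen in the traceback.
try:
    types.FunctionType("not a code object", globals())
except TypeError as exc:
    print(exc)
```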
   
   ### How to reproduce
   
   The error can be reproduced simply by adding the following import to a DAG file, without referencing any of the imported operators in the DAG's tasks:
   
   ```python
   from airflow.providers.google.cloud.operators.dataproc import (
       DataprocCreateClusterOperator,
       DataprocDeleteClusterOperator,
       DataprocSubmitJobOperator,
   )
   ```
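When triaging this, it may help to record the exact versions of the packages involved. A small diagnostic sketch (the package names checked here are illustrative, not a confirmed list of culprits):

```python
import sys
from importlib import metadata


def env_report(packages=("attrs", "openlineage-python",
                         "apache-airflow-providers-google")):
    """Collect the Python version and versions of possibly relevant packages."""
    report = {"python": sys.version.split()[0]}
    for name in packages:
        try:
            report[name] = metadata.version(name)
        except metadata.PackageNotFoundError:
            report[name] = None  # not installed in this environment
    return report


print(env_report())
```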
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

