GitHub user iscipar added a comment to the discussion: TypeError when importing operators from airflow.providers.google.cloud.operators.dataproc
Hi,

I have finally resolved the import issues and all DAGs are running correctly. Below are the steps I followed:

1) I deleted my current virtual environment and reinstalled Apache Airflow:

```
pip install apache-airflow
```

2) I installed the latest version of apache-airflow-providers-google, released yesterday:

```
pip install apache-airflow-providers-google==14.0.0rc1
```

3) At this point, starting the Airflow scheduler produced the following import errors:

```
File "/home/iscipar/Projects/airflow-tutorial/airflow_env/lib/python3.12/site-packages/airflow/providers/google/cloud/transfers/s3_to_gcs.py", line 27, in <module>
    from airflow.providers.amazon.aws.hooks.s3 import S3Hook
ModuleNotFoundError: No module named 'airflow.providers.amazon'

File "/home/iscipar/Proyectos/airflow-tutorial/airflow_env/lib/python3.12/site-packages/airflow/providers/google/cloud/transfers/azure_blob_to_gcs.py", line 28, in <module>
    from airflow.providers.microsoft.azure.hooks.wasb import WasbHook
ModuleNotFoundError: No module named 'airflow.providers.microsoft'
```

4) After analyzing the exceptions, I understood that the amazon and microsoft-azure providers also need to be installed. So I installed the versions of both providers that were also released yesterday:

```
pip install apache-airflow-providers-amazon==9.4.0rc1
pip install apache-airflow-providers-microsoft-azure==12.2.0rc1
```

After installing these packages, the DAGs are working, as mentioned above. These RC versions of the providers work for me; naturally, I will update to the final versions once they are released.

Regards,

GitHub link: https://github.com/apache/airflow/discussions/46478#discussioncomment-12285209
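Since the `ModuleNotFoundError`s above only surface once the scheduler starts parsing DAG files, it can help to verify up front that all expected provider distributions are installed in the environment. A minimal sketch using only the standard library (the `missing_distributions` helper and the package list are my own illustration, not part of Airflow):

```python
from importlib import metadata

def missing_distributions(names):
    """Return the subset of `names` that are not installed as distributions
    in the current environment."""
    missing = []
    for name in names:
        try:
            metadata.version(name)  # raises if the distribution is absent
        except metadata.PackageNotFoundError:
            missing.append(name)
    return missing

if __name__ == "__main__":
    # Providers referenced in the steps above.
    providers = [
        "apache-airflow-providers-google",
        "apache-airflow-providers-amazon",
        "apache-airflow-providers-microsoft-azure",
    ]
    absent = missing_distributions(providers)
    if absent:
        print("Missing providers:", ", ".join(absent))
    else:
        print("All expected providers are installed.")
```

Running this in the virtual environment before starting the scheduler would have flagged the missing amazon and microsoft-azure providers immediately.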
