potiuk commented on issue #21348:
URL: https://github.com/apache/airflow/issues/21348#issuecomment-1031199630


   > Hi @potiuk - It is my first time doing this testing: Are there any 
specific steps we follow?
   
   Good question. It depends on your "local" environment, but here is what I would do:
   
   1. Start Airflow 2.2.3 in Breeze:
   ```
   ./breeze start-airflow --use-airflow-version 2.2.3 --backend postgres 
--db-reset  --load-default-connections 
   ```
   
   This will start our dockerized development environment with Airflow 2.2.3 installed and open four terminals: triggerer, scheduler, webserver, and a "bash console".
   
   2. Update the package to the RC:
   
   In the console:
   ```
   pip install apache-airflow-providers-google==6.4.0rc1 
   ```
   
   Then restart the scheduler and webserver: press Ctrl+C in each terminal, then press the up arrow to recall the previous command and run it again.
   
   3. Prepare a test DAG using the Calendar API and run it.
   
   For that you will likely need to configure the "google_default_connection" to include your credentials. You can put your DAGs in the "files/dags" folder of your Airflow sources (the "files" folder is mounted inside the Docker container), and they should be scanned and become visible in the webserver.
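
   A minimal sketch of such a test DAG, dropped into "files/dags", might look like the following. Note this is an assumption-heavy example: `GoogleCalendarHook`, its `api_version` argument, and its `get_events` method are my guesses at the new Calendar API surface in the 6.4.0rc1 provider, and `google_cloud_default` is the standard Google connection id — adjust names to the actual provider API and your configured connection.

   ```python
   # Hypothetical smoke-test DAG for the 6.4.0rc1 Google provider.
   # The hook import, api_version, and get_events() signature are assumptions;
   # check them against the provider's changelog/docs before running.
   from datetime import datetime

   from airflow import DAG
   from airflow.operators.python import PythonOperator
   from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook


   def list_events():
       # "google_cloud_default" must be configured with your credentials first.
       hook = GoogleCalendarHook(api_version="v3", gcp_conn_id="google_cloud_default")
       # "primary" is the default calendar of the authenticated account.
       events = hook.get_events(calendar_id="primary")
       print(events)


   with DAG(
       dag_id="google_calendar_rc_test",
       start_date=datetime(2022, 2, 1),
       schedule_interval=None,  # trigger manually from the webserver UI
       catchup=False,
   ) as dag:
       PythonOperator(task_id="list_events", python_callable=list_events)
   ```

   Trigger it manually from the webserver and check the task log to confirm the events are listed without import or credential errors.
   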

