Yes. Using ProvidersManager is the best option, and the easiest way is
to do this (example for Databricks):

```
info = ProvidersManager().providers['apache-airflow-providers-databricks']
```

this will return a ProviderInfo class (providers is a str ->
ProviderInfo dict) which keeps information about the provider,
including the version, a "provider_info.schema.json"-compliant "data"
dictionary, and information about where the dictionary comes from:

* "package" - if the provider is installed as a package (usually a
released version)
* "source" - if the provider is installed in development mode directly
from the Airflow sources (which is the usual case for Breeze and a local
virtualenv when you install Airflow with `pip install -e`)
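Putting the above together, the lookup pattern looks roughly like this. Since this sketch should run even without Airflow installed, it uses a hypothetical stand-in NamedTuple and dict in place of the real `ProviderInfo` and `ProvidersManager().providers` (the names, the `"3.1.0"` version string, and the `package_or_source` field are assumptions for illustration; with Airflow installed you would import `ProvidersManager` from `airflow.providers_manager` instead):

```python
from typing import NamedTuple


# Hypothetical stand-in for airflow.providers_manager.ProviderInfo;
# the real class also carries the full provider metadata.
class ProviderInfo(NamedTuple):
    version: str          # provider version, e.g. "3.1.0"
    data: dict            # "provider_info.schema.json"-compliant dict
    package_or_source: str  # "package" (released) or "source" (dev mode)


# Stub registry mimicking ProvidersManager().providers (a str ->
# ProviderInfo mapping); values here are made up for the example.
providers = {
    "apache-airflow-providers-databricks": ProviderInfo(
        version="3.1.0",
        data={"package-name": "apache-airflow-providers-databricks"},
        package_or_source="package",
    ),
}

# The lookup itself is a plain dict access keyed by package name.
info = providers["apache-airflow-providers-databricks"]
print(info.version, info.package_or_source)
```

The same dict access works for any installed provider package name, so one helper can report versions for several providers at once.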

J.


On Sun, Jul 10, 2022 at 3:09 PM Alex Ott <[email protected]> wrote:
>
> Hello
>
> I want to add to the Databricks provider more details about the versions of 
> Airflow & the provider itself, so we can understand which Airflow & provider 
> versions are used and plan backports, etc.
>
> I see that we can get a list of provider versions via the get_provider_info 
> function that is automatically generated when a release is done. What would be 
> the recommended way of extracting version information that will work for both 
> released & dev versions of the provider?
>
> I see that the DBT provider uses the following code:
>
> def _get_provider_info() -> Tuple[str, str]:
>     from airflow.providers_manager import ProvidersManager
>
>     manager = ProvidersManager()
>     package_name = manager.hooks[DbtCloudHook.conn_type].package_name  # type: ignore[union-attr]
>     provider = manager.providers[package_name]
>
>     return package_name, provider.version
>
> Would that be the recommended approach?
>
> --
> With best wishes,                    Alex Ott
> http://alexott.net/
> Twitter: alexott_en (English), alexott (Russian)
