ataulmujeeb-cyber opened a new pull request, #63286:
URL: https://github.com/apache/airflow/pull/63286
## Summary
Fixes #63285
The `_get_sql_endpoint_by_name` method in `DatabricksSqlHook` fails with
`AirflowException: Can't list Databricks SQL endpoints` whenever
`sql_warehouse_name` (or `sql_endpoint_name`) is used instead of `http_path`.
### Root Cause
The `LIST_SQL_ENDPOINTS_ENDPOINT` constant was updated to use the current
Databricks API path (`GET /api/2.0/sql/warehouses`), but the response parsing
in `_get_sql_endpoint_by_name` still checks for the legacy `"endpoints"` key.
The current API returns data under `"warehouses"`:
| API Path | Response JSON Key |
|---|---|
| `GET /api/2.0/sql/endpoints` (legacy) | `"endpoints"` |
| `GET /api/2.0/sql/warehouses` (current) | `"warehouses"` |
Since the code calls the **new** path but checks for the **old** key, the
condition `if "endpoints" not in result` is always `True`, and the method
always raises an exception.
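The mismatch is easy to reproduce with a plain dict standing in for the API response (the warehouse fields shown are illustrative, not the full API schema):

```python
# Current API response shape: data lives under the "warehouses" key.
result = {
    "warehouses": [
        {"name": "My Warehouse", "odbc_params": {"path": "/sql/1.0/warehouses/abc123"}}
    ]
}

# The pre-fix check still looks for the legacy "endpoints" key,
# so it is True for every current-format response and the hook
# raises unconditionally.
legacy_check_raises = "endpoints" not in result
print(legacy_check_raises)  # True
```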
### Fix
Updated `_get_sql_endpoint_by_name` to check for both `"warehouses"`
(current) and `"endpoints"` (legacy) response keys:
```python
warehouses = result.get("warehouses") or result.get("endpoints")
if not warehouses:
    raise AirflowException("Can't list Databricks SQL warehouses")
```
This preserves backward compatibility with any Databricks workspace that still
returns the legacy format.
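For context, here is a standalone sketch of the full lookup, handling both response keys and then matching by name (the function name and error messages are illustrative, not the hook's exact code):

```python
def find_warehouse_by_name(result: dict, name: str) -> dict:
    # Accept both the current ("warehouses") and legacy ("endpoints") keys.
    warehouses = result.get("warehouses") or result.get("endpoints")
    if not warehouses:
        raise RuntimeError("Can't list Databricks SQL warehouses")
    for warehouse in warehouses:
        if warehouse.get("name") == name:
            return warehouse
    raise RuntimeError(f"No SQL warehouse found with name {name!r}")


# Both response formats resolve to the same warehouse:
current = {"warehouses": [{"name": "My Warehouse", "id": "abc123"}]}
legacy = {"endpoints": [{"name": "My Warehouse", "id": "abc123"}]}
assert (
    find_warehouse_by_name(current, "My Warehouse")
    == find_warehouse_by_name(legacy, "My Warehouse")
)
```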
### Changes
- **`providers/databricks/src/airflow/providers/databricks/hooks/databricks_sql.py`**:
  Updated `_get_sql_endpoint_by_name` to handle both `"warehouses"` and
  `"endpoints"` response keys
- **`providers/databricks/tests/unit/databricks/hooks/test_databricks_sql.py`**:
  - Updated the existing fixture to use the current `"warehouses"` response
    format
  - Added a `TestGetSqlEndpointByName` test class with 4 tests covering:
    - Current API response format (`"warehouses"` key)
    - Legacy API response format (`"endpoints"` key)
    - Warehouse name not found error
    - Empty API response error
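A rough standalone sketch of those cases (mirroring, not reproducing, the provider's tests; `parse_warehouses` is a hypothetical stand-in for the hook's parsing step):

```python
import unittest


def parse_warehouses(result: dict) -> list:
    # Same key handling as the fix above, with a generic exception type.
    warehouses = result.get("warehouses") or result.get("endpoints")
    if not warehouses:
        raise ValueError("Can't list Databricks SQL warehouses")
    return warehouses


class TestGetSqlEndpointByName(unittest.TestCase):
    def test_current_format(self):
        self.assertEqual(
            parse_warehouses({"warehouses": [{"name": "wh"}]}), [{"name": "wh"}]
        )

    def test_legacy_format(self):
        self.assertEqual(
            parse_warehouses({"endpoints": [{"name": "wh"}]}), [{"name": "wh"}]
        )

    def test_empty_response(self):
        with self.assertRaises(ValueError):
            parse_warehouses({})
```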
### Affected Components
This bug affects all operators and sensors that use `sql_warehouse_name` /
`sql_endpoint_name`:
- `DatabricksSqlSensor`
- `DatabricksPartitionSensor`
- `DatabricksSqlOperator`
- Direct usage of `DatabricksSqlHook` with `sql_endpoint_name`
### Versions Tested
- `apache-airflow-providers-databricks==7.9.1`
- Airflow SDK (Astronomer Runtime 3.1-13)
- Python 3.12
### Workaround (for users on affected versions)
Use `http_path` directly instead of `sql_warehouse_name`:
```python
# Instead of: sql_warehouse_name="My Warehouse"
http_path="/sql/1.0/warehouses/<warehouse_id>"
```