cruseakshay commented on issue #62037:
URL: https://github.com/apache/airflow/issues/62037#issuecomment-3925703971
I reproduced this with a real Snowflake PAT against Airflow 3.1.7 and
`apache-airflow-providers-snowflake==6.9.0`. Here's what I found:

**TL;DR:**
- `SnowflakeHook` works with a PAT; the docs need an update to mention PAT as a supported auth method via the password field.
- `SnowflakeSqlApiHook` needs a code fix.
- `SnowflakeSqlApiOperator` has a stale docstring that needs updating.
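For reproducibility, this is roughly how I defined the test connection, with the PAT in the password slot. The URI shape and all placeholder values (`MY_USER`, `MY_PAT_TOKEN`, account/warehouse/database names) are illustrative assumptions, not values from this report:

```python
# Illustrative sketch: define the connection used by the DAG below via an
# environment variable (Airflow resolves AIRFLOW_CONN_<CONN_ID> to a connection).
# The PAT goes where the password normally would.
import os

os.environ["AIRFLOW_CONN_SNOWFLAKE_PAT_TEST"] = (
    "snowflake://MY_USER:MY_PAT_TOKEN@/"
    "?account=my-account&warehouse=MY_WH&database=MY_DB"
)
print(os.environ["AIRFLOW_CONN_SNOWFLAKE_PAT_TEST"].startswith("snowflake://"))
```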
```python
from airflow.sdk import DAG, task

SNOWFLAKE_CONN_ID = "snowflake_pat_test"


@task
def test_snowflake_hook():
    """SnowflakeHook with PAT as password."""
    from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook

    hook = SnowflakeHook(snowflake_conn_id=SNOWFLAKE_CONN_ID)
    result = hook.run("SELECT 1 AS pat_test", handler=lambda cur: cur.fetchone())
    print(f"SnowflakeHook result: {result}")
    return str(result)


@task
def test_sql_api_hook_execute_query():
    """SnowflakeSqlApiHook.execute_query() with PAT."""
    from airflow.providers.snowflake.hooks.snowflake_sql_api import SnowflakeSqlApiHook

    hook = SnowflakeSqlApiHook(snowflake_conn_id=SNOWFLAKE_CONN_ID)
    result = hook.execute_query("SELECT 1 AS pat_test", statement_count=1)
    print(f"SnowflakeSqlApiHook query result: {result}")
    return str(result)


with DAG(
    dag_id="test_pat_auth",
    schedule=None,
    catchup=False,
):
    test_snowflake_hook()
    test_sql_api_hook_execute_query()
```
The `test_sql_api_hook_execute_query` task failed with:
```
sources=["/opt/airflow/logs/dag_id=test_pat_auth/run_id=manual__2026-02-19T08:36:28.323393+00:00/task_id=test_sql_api_hook_execute_query/attempt=1.log"]
[2026-02-19T08:36:28.659308Z] INFO - DAG bundles loaded: dags-folder
[2026-02-19T08:36:28.660607Z] INFO - Filling up the DagBag from /opt/airflow/dags/test_pat_auth.py
[2026-02-19T08:36:29.159940Z] INFO - Creating JWTGenerator with arguments account : xxxxx-xx1411, user : USER_XYXZ, lifetime : 0:59:00, renewal_delay : 0:54:00
[2026-02-19T08:36:29.160039Z] INFO - Generating a new token because the present time (2026-02-19 08:36:29.160019+00:00) is later than the renewal time (2026-02-19 08:36:29.160017+00:00)
[2026-02-19T08:36:29.160147Z] ERROR - Task failed with exception
AttributeError: 'NoneType' object has no attribute 'public_key'
File /home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/execution_time/task_runner.py, line 1068 in run
File /home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/execution_time/task_runner.py, line 1477 in _execute_task
File /home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/operator.py, line 417 in wrapper
File /home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/decorator.py, line 252 in execute
File /home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/operator.py, line 417 in wrapper
File /home/airflow/.local/lib/python3.12/site-packages/airflow/providers/standard/operators/python.py, line 228 in execute
File /home/airflow/.local/lib/python3.12/site-packages/airflow/providers/standard/operators/python.py, line 251 in execute_callable
File /home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/execution_time/callback_runner.py, line 82 in run
File /opt/airflow/dags/test_pat_auth.py, line 37 in test_sql_api_hook_execute_query
File /home/airflow/.local/lib/python3.12/site-packages/airflow/providers/snowflake/hooks/snowflake_sql_api.py, line 195 in execute_query
File /home/airflow/.local/lib/python3.12/site-packages/airflow/providers/snowflake/hooks/snowflake_sql_api.py, line 259 in get_headers
File /home/airflow/.local/lib/python3.12/site-packages/airflow/providers/snowflake/utils/sql_api_generate_jwt.py, line 133 in get_token
File /home/airflow/.local/lib/python3.12/site-packages/airflow/providers/snowflake/utils/sql_api_generate_jwt.py, line 163 in calculate_public_key_fingerprint
```
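The traceback shows `get_headers` going straight to `get_token` on the JWT generator. One possible shape for the fix is to branch on auth type before minting a JWT. This is a hedged sketch, not the provider's actual code: the function name and parameters are mine, and I'm assuming Snowflake's SQL API accepts a PAT as a bearer token with the `PROGRAMMATIC_ACCESS_TOKEN` token-type header (worth confirming against Snowflake's docs):

```python
# Hypothetical sketch of how header construction could branch on auth type.
def build_sql_api_headers(pat_token=None, private_key=None, jwt_factory=None):
    """Return SQL API auth headers, preferring a PAT when no key pair is set."""
    if private_key is None and pat_token:
        # Assumed PAT path: send the token as a bearer token and tag it
        # as a programmatic access token.
        return {
            "Authorization": f"Bearer {pat_token}",
            "X-Snowflake-Authorization-Token-Type": "PROGRAMMATIC_ACCESS_TOKEN",
            "Content-Type": "application/json",
        }
    # Existing key-pair path: mint a JWT from the private key.
    return {
        "Authorization": f"Bearer {jwt_factory()}",
        "X-Snowflake-Authorization-Token-Type": "KEYPAIR_JWT",
        "Content-Type": "application/json",
    }


headers = build_sql_api_headers(pat_token="my-pat")
print(headers["X-Snowflake-Authorization-Token-Type"])  # PROGRAMMATIC_ACCESS_TOKEN
```

With this shape, a PAT-only connection never reaches `calculate_public_key_fingerprint`, which is where the current code raises.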