eladkal commented on code in PR #63470:
URL: https://github.com/apache/airflow/pull/63470#discussion_r2982939259
##########
providers/snowflake/src/airflow/providers/snowflake/operators/snowflake.py:
##########
@@ -524,3 +524,70 @@ def on_kill(self) -> None:
self.log.info("Cancelling the query ids %s", self.query_ids)
self._hook.cancel_queries(self.query_ids)
self.log.info("Query ids %s cancelled successfully", self.query_ids)
+
+
+class SnowflakeNotebookOperator(SnowflakeSqlApiOperator):
+ """
+ Execute a Snowflake Notebook via the Snowflake SQL API.
+
+ Builds an ``EXECUTE NOTEBOOK`` statement and delegates execution to
+ :class:`~airflow.providers.snowflake.operators.snowflake.SnowflakeSqlApiOperator`,
+ which handles query submission, polling, deferral, and cancellation.
+
+ The operator supports the following authentication methods via the Snowflake connection:
+
+ - **Key pair**: provide ``private_key_file`` or ``private_key_content`` in the connection extras.
+ - **OAuth**: provide ``refresh_token``, ``client_id``, and ``client_secret`` in the connection extras.
+ - **Programmatic Access Token (PAT)**: set ``authenticator`` to ``programmatic_access_token`` in
+ the connection extras and put the PAT value in the connection ``password`` field.
+
+ .. seealso::
+ `Snowflake EXECUTE NOTEBOOK
+ <https://docs.snowflake.com/en/sql-reference/sql/execute-notebook>`_
+
+ :param notebook: Fully-qualified notebook name
+ (e.g. ``MY_DB.MY_SCHEMA.MY_NOTEBOOK``).
+ :param parameters: Optional list of parameter strings to pass to the
+ notebook. Only string values are supported by Snowflake; other
+ data types are interpreted as NULL. Parameters are accessible in
+ the notebook via ``sys.argv``.
+ :param snowflake_conn_id: Reference to the Snowflake connection.
+ :param warehouse: Snowflake warehouse name (overrides connection default).
+ :param database: Snowflake database name (overrides connection default).
+ :param schema: Snowflake schema name (overrides connection default).
+ :param role: Snowflake role name (overrides connection default).
+ :param authenticator: Snowflake authenticator type.
+ :param session_parameters: Snowflake session-level parameters.
+ :param poll_interval: Seconds between status checks (default 5).
Review Comment:
```suggestion
:param poll_interval: Seconds between status checks (default 5). Used
only in deferrable mode.
```
I assume?
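For reviewers skimming the hunk, the core of what the docstring describes can be sketched in a few lines. This is a hypothetical helper (not the PR's actual implementation, which isn't shown in this hunk), illustrating how an ``EXECUTE NOTEBOOK`` statement could be rendered from the ``notebook`` and ``parameters`` arguments:

```python
from __future__ import annotations


def build_execute_notebook_sql(notebook: str, parameters: list[str] | None = None) -> str:
    """Render an EXECUTE NOTEBOOK statement (hypothetical helper, not the PR's code)."""
    # Snowflake supports only string arguments; they reach the notebook via sys.argv.
    args = ", ".join(f"'{p}'" for p in parameters or [])
    return f"EXECUTE NOTEBOOK {notebook}({args})"


print(build_execute_notebook_sql("MY_DB.MY_SCHEMA.MY_NOTEBOOK", ["2024-01-01", "full"]))
# EXECUTE NOTEBOOK MY_DB.MY_SCHEMA.MY_NOTEBOOK('2024-01-01', 'full')
```

Note this sketch does no quoting/escaping of parameter values; the real operator would need to handle that.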
##########
providers/snowflake/src/airflow/providers/snowflake/operators/snowflake.py:
##########
@@ -524,3 +524,70 @@ def on_kill(self) -> None:
self.log.info("Cancelling the query ids %s", self.query_ids)
self._hook.cancel_queries(self.query_ids)
self.log.info("Query ids %s cancelled successfully", self.query_ids)
+
+
+class SnowflakeNotebookOperator(SnowflakeSqlApiOperator):
+ """
+ Execute a Snowflake Notebook via the Snowflake SQL API.
+
+ Builds an ``EXECUTE NOTEBOOK`` statement and delegates execution to
+ :class:`~airflow.providers.snowflake.operators.snowflake.SnowflakeSqlApiOperator`,
+ which handles query submission, polling, deferral, and cancellation.
+
+ The operator supports the following authentication methods via the Snowflake connection:
+
+ - **Key pair**: provide ``private_key_file`` or ``private_key_content`` in the connection extras.
+ - **OAuth**: provide ``refresh_token``, ``client_id``, and ``client_secret`` in the connection extras.
+ - **Programmatic Access Token (PAT)**: set ``authenticator`` to ``programmatic_access_token`` in
+ the connection extras and put the PAT value in the connection ``password`` field.
+
+ .. seealso::
+ `Snowflake EXECUTE NOTEBOOK
+ <https://docs.snowflake.com/en/sql-reference/sql/execute-notebook>`_
Review Comment:
Do we need this here? Feels more like something we should explain in the
provider connection docs.
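To make the docstring's three auth options concrete, here are placeholder connection extras for each. The key names come from the docstring above; the values are made up:

```python
# Placeholder extras for each auth method described in the docstring above.
# Key names are from the docstring; values are illustrative only.
key_pair_extras = {"private_key_file": "/path/to/rsa_key.p8"}
oauth_extras = {
    "refresh_token": "<refresh-token>",
    "client_id": "<client-id>",
    "client_secret": "<client-secret>",
}
# The PAT value itself goes in the connection's password field, not the extras.
pat_extras = {"authenticator": "programmatic_access_token"}

for extras in (key_pair_extras, oauth_extras, pat_extras):
    print(extras)
```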
##########
providers/snowflake/tests/system/snowflake/example_snowflake_notebook.py:
##########
@@ -0,0 +1,75 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""
+Example use of SnowflakeNotebookOperator.
+"""
+
+from __future__ import annotations
+
+import os
+from datetime import datetime
+
+from airflow import DAG
+from airflow.providers.snowflake.operators.snowflake import SnowflakeNotebookOperator
+
+SNOWFLAKE_CONN_ID = "my_snowflake_conn"
+SNOWFLAKE_NOTEBOOK = os.environ.get("SNOWFLAKE_NOTEBOOK", "MY_DB.MY_SCHEMA.MY_NOTEBOOK")
+ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
+DAG_ID = "example_snowflake_notebook"
+
+with DAG(
+ DAG_ID,
+ start_date=datetime(2021, 1, 1),
+ default_args={"snowflake_conn_id": SNOWFLAKE_CONN_ID},
+ tags=["example"],
+ schedule="@once",
+ catchup=False,
+) as dag:
+ # [START howto_operator_snowflake_notebook]
Review Comment:
For the example to appear in the docs, you need to reference this marker in the
provider's rst file.
Example:
https://github.com/apache/airflow/blob/7fbebb4908f3d9b8b7a8c6b8697120c29b445a95/providers/snowflake/docs/operators/snowflake.rst#L50-L54
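Concretely, something like the following in the provider's operator docs page; the exact path and ``dedent`` are guesses modeled on the linked example:

```rst
.. exampleinclude:: /../tests/system/snowflake/example_snowflake_notebook.py
    :language: python
    :dedent: 4
    :start-after: [START howto_operator_snowflake_notebook]
    :end-before: [END howto_operator_snowflake_notebook]
```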
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]