Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-13 Thread via GitHub


wgervasio commented on PR #46579:
URL: https://github.com/apache/airflow/pull/46579#issuecomment-2657902038

   insane PR mate


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-13 Thread via GitHub


boring-cyborg[bot] commented on PR #46579:
URL: https://github.com/apache/airflow/pull/46579#issuecomment-2657873322

   Awesome work, congrats on your first merged pull request! You are invited to 
check our [Issue Tracker](https://github.com/apache/airflow/issues) for 
additional contributions.
   





Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-13 Thread via GitHub


o-nikolas merged PR #46579:
URL: https://github.com/apache/airflow/pull/46579





Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-13 Thread via GitHub


vincbeck closed pull request #46579: Add MwaaTriggerDagRunOperator and MwaaHook 
to Amazon Provider Package
URL: https://github.com/apache/airflow/pull/46579





Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-12 Thread via GitHub


ramitkataria commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1953489637


##
providers/amazon/src/airflow/providers/amazon/aws/operators/mwaa.py:
##
@@ -0,0 +1,110 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains AWS MWAA operators."""
+
+from __future__ import annotations
+
+from collections.abc import Sequence
+from typing import TYPE_CHECKING
+
+from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook
+from airflow.providers.amazon.aws.operators.base_aws import AwsBaseOperator
+from airflow.providers.amazon.aws.utils.mixins import aws_template_fields
+
+if TYPE_CHECKING:
+from airflow.utils.context import Context
+
+
+class MwaaTriggerDagRunOperator(AwsBaseOperator[MwaaHook]):
+"""
+Trigger a Dag Run for a Dag in an Amazon MWAA environment.
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:MwaaTriggerDagRunOperator`
+
+:param env_name: The MWAA environment name (templated)
+:param trigger_dag_id: The ID of the DAG to be triggered (templated)
+:param trigger_run_id: The Run ID. This together with trigger_dag_id are a 
unique key. (templated)
+:param logical_date: The logical date (previously called execution date). 
This is the time or interval
+covered by this DAG run, according to the DAG definition. This 
together with trigger_dag_id are a
+unique key. (templated)
+:param data_interval_start: The beginning of the interval the DAG run 
covers
+:param data_interval_end: The end of the interval the DAG run covers
+:param conf: Additional configuration parameters. The value of this field 
can be set only when creating
+the object. (templated)
+:param note: Contains manually entered notes by the user about the DagRun. 
(templated)
+"""
+
+aws_hook_class = MwaaHook
+template_fields: Sequence[str] = aws_template_fields(
+"env_name",
+"trigger_dag_id",
+"trigger_run_id",
+"logical_date",
+"data_interval_start",
+"data_interval_end",
+"conf",
+"note",
+)
+template_fields_renderers = {"conf": "json"}
+ui_color = "#6ad3fa"

Review Comment:
   Fixed!
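
   For readers skimming the quoted diff, here is a minimal usage sketch of the
operator under review. It is not taken from the PR; the environment name, DAG
ids, and `conf` payload are placeholders, and it assumes the Amazon provider
containing this operator is installed.

```python
# Hypothetical usage sketch of MwaaTriggerDagRunOperator; all names below are
# placeholders, not values from the PR or its example DAG.
from datetime import datetime

from airflow.models.dag import DAG
from airflow.providers.amazon.aws.operators.mwaa import MwaaTriggerDagRunOperator

with DAG(dag_id="trigger_mwaa_example", start_date=datetime(2025, 1, 1), schedule=None):
    MwaaTriggerDagRunOperator(
        task_id="trigger_remote_dag",
        env_name="my-mwaa-env",           # MWAA environment to call (templated)
        trigger_dag_id="hello_world",     # DAG to trigger in that environment (templated)
        conf={"source": "upstream-dag"},  # rendered with the "json" template renderer
    )
```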






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-12 Thread via GitHub


vincbeck commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1953331074


##
providers/amazon/src/airflow/providers/amazon/aws/operators/mwaa.py:
##
@@ -0,0 +1,110 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains AWS MWAA operators."""
+
+from __future__ import annotations
+
+from collections.abc import Sequence
+from typing import TYPE_CHECKING
+
+from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook
+from airflow.providers.amazon.aws.operators.base_aws import AwsBaseOperator
+from airflow.providers.amazon.aws.utils.mixins import aws_template_fields
+
+if TYPE_CHECKING:
+from airflow.utils.context import Context
+
+
+class MwaaTriggerDagRunOperator(AwsBaseOperator[MwaaHook]):
+"""
+Trigger a Dag Run for a Dag in an Amazon MWAA environment.
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:MwaaTriggerDagRunOperator`
+
+:param env_name: The MWAA environment name (templated)
+:param trigger_dag_id: The ID of the DAG to be triggered (templated)
+:param trigger_run_id: The Run ID. This together with trigger_dag_id are a 
unique key. (templated)
+:param logical_date: The logical date (previously called execution date). 
This is the time or interval
+covered by this DAG run, according to the DAG definition. This 
together with trigger_dag_id are a
+unique key. (templated)
+:param data_interval_start: The beginning of the interval the DAG run 
covers
+:param data_interval_end: The end of the interval the DAG run covers
+:param conf: Additional configuration parameters. The value of this field 
can be set only when creating
+the object. (templated)
+:param note: Contains manually entered notes by the user about the DagRun. 
(templated)
+"""
+
+aws_hook_class = MwaaHook
+template_fields: Sequence[str] = aws_template_fields(
+"env_name",
+"trigger_dag_id",
+"trigger_run_id",
+"logical_date",
+"data_interval_start",
+"data_interval_end",
+"conf",
+"note",
+)
+template_fields_renderers = {"conf": "json"}
+ui_color = "#6ad3fa"

Review Comment:
   As discussed offline, I don't think this is necessary.






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-12 Thread via GitHub


ferruzzi commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1953297064


##
providers/src/airflow/providers/amazon/aws/operators/mwaa.py:
##
@@ -0,0 +1,111 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains AWS MWAA operators."""
+
+from __future__ import annotations
+
+from collections.abc import Sequence
+from typing import TYPE_CHECKING
+
+from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook
+from airflow.providers.amazon.aws.operators.base_aws import AwsBaseOperator
+from airflow.providers.amazon.aws.utils.mixins import aws_template_fields
+
+if TYPE_CHECKING:
+from airflow.utils.context import Context
+
+
+class MwaaTriggerDagRunOperator(AwsBaseOperator[MwaaHook]):
+"""
+Trigger a Dag Run for a Dag in an Amazon MWAA environment.
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:MwaaTriggerDagRunOperator`
+
+:param env_name: The MWAA environment name (templated)
+:param trigger_dag_id: The ID of the DAG to be triggered (templated)
+:param trigger_run_id: The Run ID. The value of this field can be set only 
when creating the object. This
+together with trigger_dag_id are a unique key. (templated)
+:param logical_date: The logical date (previously called execution date). 
This is the time or interval
+covered by this DAG run, according to the DAG definition. The value of 
this field can be set only when
+creating the object. This together with trigger_dag_id are a unique 
key. (templated)
+:param data_interval_start: The beginning of the interval the DAG run 
covers
+:param data_interval_end: The end of the interval the DAG run covers
+:param conf: Additional configuration parameters. The value of this field 
can be set only when creating
+the object. (templated)
+:param note: Contains manually entered notes by the user about the DagRun. 
(templated)
+"""
+
+aws_hook_class = MwaaHook
+template_fields: Sequence[str] = aws_template_fields(
+"env_name",
+"trigger_dag_id",
+"trigger_run_id",
+"logical_date",
+"data_interval_start",
+"data_interval_end",
+"conf",
+"note",
+)
+template_fields_renderers = {"conf": "json"}
+ui_color = "#6ad3fa"
+
+def __init__(
+self,
+*,
+env_name: str,
+trigger_dag_id: str,
+trigger_run_id: str | None = None,
+logical_date: str | None = None,
+data_interval_start: str | None = None,
+data_interval_end: str | None = None,
+conf: dict | None = None,
+note: str | None = None,
+**kwargs,
+):
+super().__init__(**kwargs)

Review Comment:
   I can get behind that. :+1: 






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-12 Thread via GitHub


ramitkataria commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1953213360


##
providers/src/airflow/providers/amazon/aws/operators/mwaa.py:
##
@@ -0,0 +1,111 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains AWS MWAA operators."""
+
+from __future__ import annotations
+
+from collections.abc import Sequence
+from typing import TYPE_CHECKING
+
+from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook
+from airflow.providers.amazon.aws.operators.base_aws import AwsBaseOperator
+from airflow.providers.amazon.aws.utils.mixins import aws_template_fields
+
+if TYPE_CHECKING:
+from airflow.utils.context import Context
+
+
+class MwaaTriggerDagRunOperator(AwsBaseOperator[MwaaHook]):
+"""
+Trigger a Dag Run for a Dag in an Amazon MWAA environment.
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:MwaaTriggerDagRunOperator`
+
+:param env_name: The MWAA environment name (templated)
+:param trigger_dag_id: The ID of the DAG to be triggered (templated)
+:param trigger_run_id: The Run ID. The value of this field can be set only 
when creating the object. This
+together with trigger_dag_id are a unique key. (templated)
+:param logical_date: The logical date (previously called execution date). 
This is the time or interval
+covered by this DAG run, according to the DAG definition. The value of 
this field can be set only when
+creating the object. This together with trigger_dag_id are a unique 
key. (templated)
+:param data_interval_start: The beginning of the interval the DAG run 
covers
+:param data_interval_end: The end of the interval the DAG run covers
+:param conf: Additional configuration parameters. The value of this field 
can be set only when creating
+the object. (templated)
+:param note: Contains manually entered notes by the user about the DagRun. 
(templated)
+"""
+
+aws_hook_class = MwaaHook
+template_fields: Sequence[str] = aws_template_fields(
+"env_name",
+"trigger_dag_id",
+"trigger_run_id",
+"logical_date",
+"data_interval_start",
+"data_interval_end",
+"conf",
+"note",
+)
+template_fields_renderers = {"conf": "json"}
+ui_color = "#6ad3fa"
+
+def __init__(
+self,
+*,
+env_name: str,
+trigger_dag_id: str,
+trigger_run_id: str | None = None,
+logical_date: str | None = None,
+data_interval_start: str | None = None,
+data_interval_end: str | None = None,
+conf: dict | None = None,
+note: str | None = None,
+**kwargs,
+):
+super().__init__(**kwargs)

Review Comment:
   Yep, makes sense






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-12 Thread via GitHub


o-nikolas commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1953206276


##
providers/src/airflow/providers/amazon/aws/operators/mwaa.py:
##
@@ -0,0 +1,111 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains AWS MWAA operators."""
+
+from __future__ import annotations
+
+from collections.abc import Sequence
+from typing import TYPE_CHECKING
+
+from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook
+from airflow.providers.amazon.aws.operators.base_aws import AwsBaseOperator
+from airflow.providers.amazon.aws.utils.mixins import aws_template_fields
+
+if TYPE_CHECKING:
+from airflow.utils.context import Context
+
+
+class MwaaTriggerDagRunOperator(AwsBaseOperator[MwaaHook]):
+"""
+Trigger a Dag Run for a Dag in an Amazon MWAA environment.
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:MwaaTriggerDagRunOperator`
+
+:param env_name: The MWAA environment name (templated)
+:param trigger_dag_id: The ID of the DAG to be triggered (templated)
+:param trigger_run_id: The Run ID. The value of this field can be set only 
when creating the object. This
+together with trigger_dag_id are a unique key. (templated)
+:param logical_date: The logical date (previously called execution date). 
This is the time or interval
+covered by this DAG run, according to the DAG definition. The value of 
this field can be set only when
+creating the object. This together with trigger_dag_id are a unique 
key. (templated)
+:param data_interval_start: The beginning of the interval the DAG run 
covers
+:param data_interval_end: The end of the interval the DAG run covers
+:param conf: Additional configuration parameters. The value of this field 
can be set only when creating
+the object. (templated)
+:param note: Contains manually entered notes by the user about the DagRun. 
(templated)
+"""
+
+aws_hook_class = MwaaHook
+template_fields: Sequence[str] = aws_template_fields(
+"env_name",
+"trigger_dag_id",
+"trigger_run_id",
+"logical_date",
+"data_interval_start",
+"data_interval_end",
+"conf",
+"note",
+)
+template_fields_renderers = {"conf": "json"}
+ui_color = "#6ad3fa"
+
+def __init__(
+self,
+*,
+env_name: str,
+trigger_dag_id: str,
+trigger_run_id: str | None = None,
+logical_date: str | None = None,
+data_interval_start: str | None = None,
+data_interval_end: str | None = None,
+conf: dict | None = None,
+note: str | None = None,
+**kwargs,
+):
+super().__init__(**kwargs)

Review Comment:
   I think a deferrable mode (with a sensor option also) would be interesting. 
You could use the API to check in on the status of the triggered DAG and block 
until its completion. The existing TriggerDagRunOperator has this 
functionality, so there is already a precedent for this: 
https://github.com/aws-mwaa/upstream-to-airflow/blob/9ba45d1611f97a62da055d9186a60416e1ce6f7c/providers/standard/src/airflow/providers/standard/operators/trigger_dagrun.py#L110
   
   BUT, I don't think that needs to block this initial PR; let's not boil the 
ocean. We can fast-follow with this support after this is merged.
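
   To make the idea above concrete, here is a rough, hypothetical sketch of how
a sensor or deferrable follow-up might poll the triggered run through the hook
added in this PR. The REST path, the terminal states, and the response shape
are assumptions based on the Airflow stable REST API and the example responses
quoted later in this thread, not anything implemented here.

```python
# Hypothetical polling sketch only; not part of this PR. Assumes the
# MwaaHook.invoke_rest_api signature shown in this thread and that the boto3
# response nests the Airflow REST payload under "RestApiResponse".
import time

from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook


def wait_for_mwaa_dag_run(env_name: str, dag_id: str, run_id: str, poke_interval: int = 60) -> str:
    """Block until the remote DAG run reaches a terminal state and return it."""
    hook = MwaaHook()
    while True:
        response = hook.invoke_rest_api(
            env_name=env_name,
            path=f"/dags/{dag_id}/dagRuns/{run_id}",
            method="GET",
        )
        state = response["RestApiResponse"]["state"]
        if state in ("success", "failed"):
            return state
        time.sleep(poke_interval)
```

   A real deferrable implementation would push this wait into a trigger instead
of blocking a worker slot, which is what the linked TriggerDagRunOperator code
does.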






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-11 Thread via GitHub


ramitkataria commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1950098702


##
providers/src/airflow/providers/amazon/aws/hooks/mwaa.py:
##
@@ -0,0 +1,81 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains AWS MWAA hook."""
+
+from __future__ import annotations
+
+from botocore.exceptions import ClientError
+
+from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook
+
+
+class MwaaHook(AwsBaseHook):
+"""
+Interact with AWS Manager Workflows for Apache Airflow.
+
+Provide thin wrapper around :external+boto3:py:class:`boto3.client("mwaa") 
`
+
+Additional arguments (such as ``aws_conn_id``) may be specified and
+are passed down to the underlying AwsBaseHook.
+
+.. seealso::
+- :class:`airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook`
+"""
+
+def __init__(self, *args, **kwargs) -> None:
+kwargs["client_type"] = "mwaa"
+super().__init__(*args, **kwargs)
+
+def invoke_rest_api(
+self,
+env_name: str,
+path: str,
+method: str,
+body: dict | None = None,
+query_params: dict | None = None,
+) -> dict:
+"""
+Invoke the REST API on the Airflow webserver with the specified inputs.
+
+.. seealso::
+- :external+boto3:py:meth:`MWAA.Client.invoke_rest_api`
+
+:param env_name: name of the MWAA environment
+:param path: Apache Airflow REST API endpoint path to be called
+:param method: HTTP method used for making Airflow REST API calls
+:param body: Request body for the Apache Airflow REST API call
+:param query_params: Query parameters to be included in the Apache 
Airflow REST API call
+"""
+body = body or {}
+api_kwargs = {
+"Name": env_name,
+"Path": path,
+"Method": method,
+# Filter out keys with None values because Airflow REST API 
doesn't accept requests otherwise
+"Body": {k: v for k, v in body.items() if v is not None},
+"QueryParameters": query_params if query_params else {},
+}
+try:
+result = self.get_conn().invoke_rest_api(**api_kwargs)
+result.pop("ResponseMetadata", None)

Review Comment:
   Done!
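
   As an aside for anyone trying the hook from this diff, here is a minimal,
hypothetical call sketch; the environment name and DAG id are placeholders, and
it assumes valid AWS credentials plus the invoke_rest_api signature shown above.

```python
# Illustrative only: triggering a DAG run via the MwaaHook from this PR.
# "my-mwaa-env" and "hello_world" are placeholders, not values from the PR.
from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook

hook = MwaaHook()
response = hook.invoke_rest_api(
    env_name="my-mwaa-env",
    path="/dags/hello_world/dagRuns",
    method="POST",
    body={"conf": {}},  # keys with None values are filtered out by the hook
)
# The boto3 response nests the Airflow REST API payload under "RestApiResponse".
print(response["RestApiResponse"]["dag_run_id"])
```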






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-11 Thread via GitHub


ramitkataria commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1951802796


##
providers/tests/system/amazon/aws/example_mwaa.py:
##
@@ -0,0 +1,86 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from datetime import datetime
+
+from airflow.models.baseoperator import chain
+from airflow.models.dag import DAG
+from airflow.providers.amazon.aws.operators.mwaa import 
MwaaTriggerDagRunOperator
+
+from providers.tests.system.amazon.aws.utils import SystemTestContextBuilder
+
+DAG_ID = "example_mwaa"
+
+# Externally fetched variables:
+EXISTING_ENVIRONMENT_NAME_KEY = "ENVIRONMENT_NAME"
+EXISTING_DAG_ID_KEY = "DAG_ID"
+
+
+sys_test_context_task = (
+SystemTestContextBuilder()
+# NOTE: Creating a functional MWAA environment is time-consuming and 
requires
+# manually creating and configuring an S3 bucket for DAG storage and a VPC 
with
+# private subnets which is out of scope for this demo. To simplify this 
demo and
+# make it run in a reasonable time, an existing MWAA environment with is 
required

Review Comment:
   Yep 😅 fixed






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-11 Thread via GitHub


ramitkataria commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1951802226


##
providers/tests/amazon/aws/hooks/test_mwaa.py:
##
@@ -0,0 +1,130 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from unittest import mock
+
+import pytest
+from botocore.exceptions import ClientError
+from moto import mock_aws
+
+from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook
+
+ENV_NAME = "test_env"
+PATH = "/dags/test_dag/dagRuns"
+METHOD = "POST"
+QUERY_PARAMS = {"limit": 30}
+
+
+class TestMwaaHook:
+def setup_method(self):
+self.hook = MwaaHook()
+
+# these examples responses are included here instead of as a constant 
because the hook will mutate
+# responses causing subsequent tests to fail
+self.example_responses = {
+"success": {
+"ResponseMetadata": {
+"RequestId": "some ID",
+"HTTPStatusCode": 200,
+"HTTPHeaders": {"header1": "value1"},
+"RetryAttempts": 0,
+},
+"RestApiStatusCode": 200,
+"RestApiResponse": {
+"conf": {},
+"dag_id": "hello_world",
+"dag_run_id": "manual__2025-02-08T00:33:09.457198+00:00",
+"data_interval_end": "2025-02-08T00:33:09.457198+00:00",
+"data_interval_start": "2025-02-08T00:33:09.457198+00:00",
+"execution_date": "2025-02-08T00:33:09.457198+00:00",
+"external_trigger": True,
+"logical_date": "2025-02-08T00:33:09.457198+00:00",
+"run_type": "manual",
+"state": "queued",
+},
+},
+"failure": {
+"Error": {"Message": "", "Code": "RestApiClientException"},
+"ResponseMetadata": {
+"RequestId": "some ID",
+"HTTPStatusCode": 400,
+"HTTPHeaders": {"header1": "value1"},
+"RetryAttempts": 0,
+},
+"RestApiStatusCode": 404,
+"RestApiResponse": {
+"detail": "DAG with dag_id: 'hello_world1' not found",
+"status": 404,
+"title": "DAG not found",
+"type": 
"https://airflow.apache.org/docs/apache-airflow/2.10.3/stable-rest-api-ref.html#section/Errors/NotFound";,
+},
+},
+}
+
+def test_init(self):
+assert self.hook.client_type == "mwaa"
+
+@mock_aws
+def test_get_conn(self):
+assert self.hook.get_conn() is not None
+
+@pytest.mark.parametrize(
+"body",
+[
+None,  # test case: empty body
+{"conf": {}},  # test case: non-empty body
+],
+)

Review Comment:
   That looks much better, thanks!






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-11 Thread via GitHub


ramitkataria commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1951800498


##
providers/src/airflow/providers/amazon/aws/hooks/mwaa.py:
##
@@ -0,0 +1,85 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains AWS MWAA hook."""
+
+from __future__ import annotations
+
+from botocore.exceptions import ClientError
+
+from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook
+
+
+class MwaaHook(AwsBaseHook):
+"""
+Interact with AWS Manager Workflows for Apache Airflow.
+
+Provide thin wrapper around :external+boto3:py:class:`boto3.client("mwaa") 
`
+
+Additional arguments (such as ``aws_conn_id``) may be specified and
+are passed down to the underlying AwsBaseHook.
+
+.. seealso::
+- :class:`airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook`
+"""
+
+def __init__(self, *args, **kwargs) -> None:
+kwargs["client_type"] = "mwaa"
+super().__init__(*args, **kwargs)
+
+def invoke_rest_api(
+self,
+env_name: str,
+path: str,
+method: str,
+body: dict | None = None,
+query_params: dict | None = None,
+) -> dict:
+"""
+Invoke the REST API on the Airflow webserver with the specified inputs.
+
+.. seealso::
+- :external+boto3:py:meth:`MWAA.Client.invoke_rest_api`
+
+:param env_name: name of the MWAA environment
+:param path: Apache Airflow REST API endpoint path to be called
+:param method: HTTP method used for making Airflow REST API calls
+:param body: Request body for the Apache Airflow REST API call
+:param query_params: Query parameters to be included in the Apache 
Airflow REST API call
+"""
+body = body or {}
+api_kwargs = {
+"Name": env_name,
+"Path": path,
+"Method": method,
+# Filter out keys with None values because Airflow REST API 
doesn't accept requests otherwise
+"Body": {k: v for k, v in body.items() if v is not None},
+"QueryParameters": query_params if query_params else {},
+}
+try:
+result = self.get_conn().invoke_rest_api(**api_kwargs)

Review Comment:
   Just looked at `hooks/base_aws.py` and yes, it does look like `get_conn()` 
exists just for compatibility now. Fixed, thanks!






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-11 Thread via GitHub


ferruzzi commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1951765833


##
providers/tests/amazon/aws/hooks/test_mwaa.py:
##
@@ -0,0 +1,130 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from unittest import mock
+
+import pytest
+from botocore.exceptions import ClientError
+from moto import mock_aws
+
+from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook
+
+ENV_NAME = "test_env"
+PATH = "/dags/test_dag/dagRuns"
+METHOD = "POST"
+QUERY_PARAMS = {"limit": 30}
+
+
+class TestMwaaHook:
+def setup_method(self):
+self.hook = MwaaHook()
+
+# these examples responses are included here instead of as a constant 
because the hook will mutate
+# responses causing subsequent tests to fail
+self.example_responses = {
+"success": {
+"ResponseMetadata": {
+"RequestId": "some ID",
+"HTTPStatusCode": 200,
+"HTTPHeaders": {"header1": "value1"},
+"RetryAttempts": 0,
+},
+"RestApiStatusCode": 200,
+"RestApiResponse": {
+"conf": {},
+"dag_id": "hello_world",
+"dag_run_id": "manual__2025-02-08T00:33:09.457198+00:00",
+"data_interval_end": "2025-02-08T00:33:09.457198+00:00",
+"data_interval_start": "2025-02-08T00:33:09.457198+00:00",
+"execution_date": "2025-02-08T00:33:09.457198+00:00",
+"external_trigger": True,
+"logical_date": "2025-02-08T00:33:09.457198+00:00",
+"run_type": "manual",
+"state": "queued",
+},
+},
+"failure": {
+"Error": {"Message": "", "Code": "RestApiClientException"},
+"ResponseMetadata": {
+"RequestId": "some ID",
+"HTTPStatusCode": 400,
+"HTTPHeaders": {"header1": "value1"},
+"RetryAttempts": 0,
+},
+"RestApiStatusCode": 404,
+"RestApiResponse": {
+"detail": "DAG with dag_id: 'hello_world1' not found",
+"status": 404,
+"title": "DAG not found",
+"type": 
"https://airflow.apache.org/docs/apache-airflow/2.10.3/stable-rest-api-ref.html#section/Errors/NotFound";,
+},
+},
+}
+
+def test_init(self):
+assert self.hook.client_type == "mwaa"
+
+@mock_aws
+def test_get_conn(self):
+assert self.hook.get_conn() is not None
+
+@pytest.mark.parametrize(
+"body",
+[
+None,  # test case: empty body
+{"conf": {}},  # test case: non-empty body
+],
+)

Review Comment:
   Since you added the comments there, I'll leave a nit/suggestion (and it is 
just a suggestion): you can use pytest.param here if you'd like:
   ```suggestion
    @pytest.mark.parametrize(
        "body",
        [
            pytest.param(None, id="empty_body"),
            pytest.param({"conf": {}}, id="non_empty_body"),
        ],
    )
   ```
   
   When you run the tests with this change, the `id` gets added to the end of 
the test name, so it'll show as `test_invoke_rest_api_success_empty_body` and 
`test_invoke_rest_api_success_non_empty_body` when you run the tests in pytest 
instead of `test_invoke_rest_api_success_0` and 
`test_invoke_rest_api_success_1`. It's a little more useful. If you are adding 
the comments anyway, and I agree with that addition, you may as well make it 
part of the useful code too.




Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-11 Thread via GitHub


ferruzzi commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1951769390


##
providers/tests/system/amazon/aws/example_mwaa.py:
##
@@ -0,0 +1,86 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from datetime import datetime
+
+from airflow.models.baseoperator import chain
+from airflow.models.dag import DAG
+from airflow.providers.amazon.aws.operators.mwaa import 
MwaaTriggerDagRunOperator
+
+from providers.tests.system.amazon.aws.utils import SystemTestContextBuilder
+
+DAG_ID = "example_mwaa"
+
+# Externally fetched variables:
+EXISTING_ENVIRONMENT_NAME_KEY = "ENVIRONMENT_NAME"
+EXISTING_DAG_ID_KEY = "DAG_ID"
+
+
+sys_test_context_task = (
+SystemTestContextBuilder()
+# NOTE: Creating a functional MWAA environment is time-consuming and 
requires
+# manually creating and configuring an S3 bucket for DAG storage and a VPC 
with
+# private subnets which is out of scope for this demo. To simplify this 
demo and
+# make it run in a reasonable time, an existing MWAA environment with is 
required

Review Comment:
   I think you got squirreled here. "an existing MWAA environment with is 
required". You forgot something there.






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-11 Thread via GitHub


ferruzzi commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1951768225


##
providers/tests/amazon/aws/hooks/test_mwaa.py:
##
@@ -0,0 +1,130 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from unittest import mock
+
+import pytest
+from botocore.exceptions import ClientError
+from moto import mock_aws
+
+from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook
+
+ENV_NAME = "test_env"
+PATH = "/dags/test_dag/dagRuns"
+METHOD = "POST"
+QUERY_PARAMS = {"limit": 30}
+
+
+class TestMwaaHook:
+def setup_method(self):
+self.hook = MwaaHook()
+
+# these examples responses are included here instead of as a constant 
because the hook will mutate
+# responses causing subsequent tests to fail
+self.example_responses = {
+"success": {
+"ResponseMetadata": {
+"RequestId": "some ID",
+"HTTPStatusCode": 200,
+"HTTPHeaders": {"header1": "value1"},
+"RetryAttempts": 0,
+},
+"RestApiStatusCode": 200,
+"RestApiResponse": {
+"conf": {},
+"dag_id": "hello_world",
+"dag_run_id": "manual__2025-02-08T00:33:09.457198+00:00",
+"data_interval_end": "2025-02-08T00:33:09.457198+00:00",
+"data_interval_start": "2025-02-08T00:33:09.457198+00:00",
+"execution_date": "2025-02-08T00:33:09.457198+00:00",
+"external_trigger": True,
+"logical_date": "2025-02-08T00:33:09.457198+00:00",
+"run_type": "manual",
+"state": "queued",
+},
+},
+"failure": {
+"Error": {"Message": "", "Code": "RestApiClientException"},
+"ResponseMetadata": {
+"RequestId": "some ID",
+"HTTPStatusCode": 400,
+"HTTPHeaders": {"header1": "value1"},
+"RetryAttempts": 0,
+},
+"RestApiStatusCode": 404,
+"RestApiResponse": {
+"detail": "DAG with dag_id: 'hello_world1' not found",
+"status": 404,
+"title": "DAG not found",
+"type": 
"https://airflow.apache.org/docs/apache-airflow/2.10.3/stable-rest-api-ref.html#section/Errors/NotFound";,
+},
+},
+}
+
+def test_init(self):
+assert self.hook.client_type == "mwaa"
+
+@mock_aws
+def test_get_conn(self):
+assert self.hook.get_conn() is not None
+
+@pytest.mark.parametrize(
+"body",
+[
+None,  # test case: empty body
+{"conf": {}},  # test case: non-empty body
+],
+)
+@mock.patch.object(MwaaHook, "get_conn")
+def test_invoke_rest_api_success(self, mock_conn, body) -> None:
+boto_invoke_mock = 
mock.MagicMock(return_value=self.example_responses["success"])
+mock_conn.return_value.invoke_rest_api = boto_invoke_mock
+
+retval = self.hook.invoke_rest_api(ENV_NAME, PATH, METHOD, body, 
QUERY_PARAMS)
+kwargs_to_assert = {
+"Name": ENV_NAME,
+"Path": PATH,
+"Method": METHOD,
+"Body": body if body else {},
+"QueryParameters": QUERY_PARAMS,
+}
+boto_invoke_mock.assert_called_once_with(**kwargs_to_assert)
+assert retval == {
+k: v for k, v in self.example_responses["success"].items() if k != 
"ResponseMetadata"
+}
+
+@mock.patch.object(MwaaHook, "get_conn")

Review Comment:
   If you end up changing get_conn for conn, remember to make the same changes 
to the tests down here. If not, just resolve this comment.
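
   If that switch does happen, here is a hypothetical sketch of what the
updated mock could look like; the patching style is an assumption, since `conn`
is a (cached) property on the base hook rather than a plain method, and the
call assumes the invoke_rest_api signature shown in this thread.

```python
# Hypothetical test sketch: patching the cached ``conn`` property instead of
# get_conn(). Not the test code from this PR.
from unittest import mock

from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook


@mock.patch.object(MwaaHook, "conn", new_callable=mock.PropertyMock)
def test_invoke_rest_api_uses_conn(mock_conn):
    # Property access on the hook returns this mocked boto3 client.
    mock_conn.return_value.invoke_rest_api.return_value = {
        "RestApiStatusCode": 200,
        "RestApiResponse": {"state": "queued"},
    }
    hook = MwaaHook()
    result = hook.invoke_rest_api("test_env", "/dags/test_dag/dagRuns", "POST")
    assert result["RestApiStatusCode"] == 200
```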




Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-11 Thread via GitHub


ferruzzi commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1951765833


##
providers/tests/amazon/aws/hooks/test_mwaa.py:
##
@@ -0,0 +1,130 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from unittest import mock
+
+import pytest
+from botocore.exceptions import ClientError
+from moto import mock_aws
+
+from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook
+
+ENV_NAME = "test_env"
+PATH = "/dags/test_dag/dagRuns"
+METHOD = "POST"
+QUERY_PARAMS = {"limit": 30}
+
+
+class TestMwaaHook:
+def setup_method(self):
+self.hook = MwaaHook()
+
+# these examples responses are included here instead of as a constant 
because the hook will mutate
+# responses causing subsequent tests to fail
+self.example_responses = {
+"success": {
+"ResponseMetadata": {
+"RequestId": "some ID",
+"HTTPStatusCode": 200,
+"HTTPHeaders": {"header1": "value1"},
+"RetryAttempts": 0,
+},
+"RestApiStatusCode": 200,
+"RestApiResponse": {
+"conf": {},
+"dag_id": "hello_world",
+"dag_run_id": "manual__2025-02-08T00:33:09.457198+00:00",
+"data_interval_end": "2025-02-08T00:33:09.457198+00:00",
+"data_interval_start": "2025-02-08T00:33:09.457198+00:00",
+"execution_date": "2025-02-08T00:33:09.457198+00:00",
+"external_trigger": True,
+"logical_date": "2025-02-08T00:33:09.457198+00:00",
+"run_type": "manual",
+"state": "queued",
+},
+},
+"failure": {
+"Error": {"Message": "", "Code": "RestApiClientException"},
+"ResponseMetadata": {
+"RequestId": "some ID",
+"HTTPStatusCode": 400,
+"HTTPHeaders": {"header1": "value1"},
+"RetryAttempts": 0,
+},
+"RestApiStatusCode": 404,
+"RestApiResponse": {
+"detail": "DAG with dag_id: 'hello_world1' not found",
+"status": 404,
+"title": "DAG not found",
+"type": 
"https://airflow.apache.org/docs/apache-airflow/2.10.3/stable-rest-api-ref.html#section/Errors/NotFound";,
+},
+},
+}
+
+def test_init(self):
+assert self.hook.client_type == "mwaa"
+
+@mock_aws
+def test_get_conn(self):
+assert self.hook.get_conn() is not None
+
+@pytest.mark.parametrize(
+"body",
+[
+None,  # test case: empty body
+{"conf": {}},  # test case: non-empty body
+],
+)

Review Comment:
   Since you added the comments there, I'll leave a nit/suggestion: you can 
use pytest.param here if you'd like:
   ```suggestion
    @pytest.mark.parametrize(
        "body",
        [
            pytest.param(None, id="empty_body"),
            pytest.param({"conf": {}}, id="non_empty_body"),
        ],
    )
   ```
   
   When you run the tests with this change, the `id` gets added to the end of 
the test name, so it'll show as `test_invoke_rest_api_success_empty_body` and 
`test_invoke_rest_api_success_non_empty_body` when you run the tests in pytest 
instead of `test_invoke_rest_api_success_0` and 
`test_invoke_rest_api_success_1`. It's a little more useful. If you are adding 
the comments anyway, and I agree with that addition, you may as well make it 
part of the useful code too.






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-11 Thread via GitHub


ferruzzi commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1951761624


##
providers/tests/amazon/aws/hooks/test_mwaa.py:
##
@@ -0,0 +1,130 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from unittest import mock
+
+import pytest
+from botocore.exceptions import ClientError
+from moto import mock_aws
+
+from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook
+
+ENV_NAME = "test_env"
+PATH = "/dags/test_dag/dagRuns"
+METHOD = "POST"
+QUERY_PARAMS = {"limit": 30}
+
+
+class TestMwaaHook:
+def setup_method(self):
+self.hook = MwaaHook()
+
+# these examples responses are included here instead of as a constant 
because the hook will mutate
+# responses causing subsequent tests to fail

Review Comment:
   Nice comment, thanks.






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-11 Thread via GitHub


ferruzzi commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1951761079


##
providers/src/airflow/providers/amazon/aws/operators/mwaa.py:
##
@@ -0,0 +1,111 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains AWS MWAA operators."""
+
+from __future__ import annotations
+
+from collections.abc import Sequence
+from typing import TYPE_CHECKING
+
+from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook
+from airflow.providers.amazon.aws.operators.base_aws import AwsBaseOperator
+from airflow.providers.amazon.aws.utils.mixins import aws_template_fields
+
+if TYPE_CHECKING:
+from airflow.utils.context import Context
+
+
+class MwaaTriggerDagRunOperator(AwsBaseOperator[MwaaHook]):
+"""
+Trigger a Dag Run for a Dag in an Amazon MWAA environment.
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:MwaaTriggerDagRunOperator`
+
+:param env_name: The MWAA environment name (templated)
+:param trigger_dag_id: The ID of the DAG to be triggered (templated)
+:param trigger_run_id: The Run ID. The value of this field can be set only 
when creating the object. This
+together with trigger_dag_id are a unique key. (templated)
+:param logical_date: The logical date (previously called execution date). 
This is the time or interval
+covered by this DAG run, according to the DAG definition. The value of 
this field can be set only when
+creating the object. This together with trigger_dag_id are a unique 
key. (templated)
+:param data_interval_start: The beginning of the interval the DAG run 
covers
+:param data_interval_end: The end of the interval the DAG run covers
+:param conf: Additional configuration parameters. The value of this field 
can be set only when creating
+the object. (templated)
+:param note: Contains manually entered notes by the user about the DagRun. 
(templated)
+"""
+
+aws_hook_class = MwaaHook
+template_fields: Sequence[str] = aws_template_fields(
+"env_name",
+"trigger_dag_id",
+"trigger_run_id",
+"logical_date",
+"data_interval_start",
+"data_interval_end",
+"conf",
+"note",
+)
+template_fields_renderers = {"conf": "json"}
+ui_color = "#6ad3fa"
+
+def __init__(
+self,
+*,
+env_name: str,
+trigger_dag_id: str,
+trigger_run_id: str | None = None,
+logical_date: str | None = None,
+data_interval_start: str | None = None,
+data_interval_end: str | None = None,
+conf: dict | None = None,
+note: str | None = None,
+**kwargs,
+):
+super().__init__(**kwargs)

Review Comment:
   No support for deferrable mode?  I'm not sure it would even make sense in 
this case, just making sure it was considered.






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-11 Thread via GitHub


ferruzzi commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1951760478


##
providers/src/airflow/providers/amazon/aws/hooks/mwaa.py:
##
@@ -0,0 +1,85 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains AWS MWAA hook."""
+
+from __future__ import annotations
+
+from botocore.exceptions import ClientError
+
+from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook
+
+
+class MwaaHook(AwsBaseHook):
+"""
+Interact with AWS Manager Workflows for Apache Airflow.
+
+Provide thin wrapper around :external+boto3:py:class:`boto3.client("mwaa") 
`
+
+Additional arguments (such as ``aws_conn_id``) may be specified and
+are passed down to the underlying AwsBaseHook.
+
+.. seealso::
+- :class:`airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook`
+"""
+
+def __init__(self, *args, **kwargs) -> None:
+kwargs["client_type"] = "mwaa"
+super().__init__(*args, **kwargs)
+
+def invoke_rest_api(
+self,
+env_name: str,
+path: str,
+method: str,
+body: dict | None = None,
+query_params: dict | None = None,
+) -> dict:
+"""
+Invoke the REST API on the Airflow webserver with the specified inputs.
+
+.. seealso::
+- :external+boto3:py:meth:`MWAA.Client.invoke_rest_api`
+
+:param env_name: name of the MWAA environment
+:param path: Apache Airflow REST API endpoint path to be called
+:param method: HTTP method used for making Airflow REST API calls
+:param body: Request body for the Apache Airflow REST API call
+:param query_params: Query parameters to be included in the Apache 
Airflow REST API call
+"""
+body = body or {}
+api_kwargs = {
+"Name": env_name,
+"Path": path,
+"Method": method,
+# Filter out keys with None values because Airflow REST API 
doesn't accept requests otherwise
+"Body": {k: v for k, v in body.items() if v is not None},
+"QueryParameters": query_params if query_params else {},
+}
+try:
+result = self.get_conn().invoke_rest_api(**api_kwargs)

Review Comment:
   I think we are supposed to be phasing out `get_conn()` and using the cached
`conn` property instead. At least, that's what my notes say... but I don't
know if that's still the case.
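   
   If that is still the guidance, the change here would just be something like
(sketch only):
   
   ```python
   # Sketch: same call, but via the cached `conn` property instead of
   # fetching the client with get_conn() on every invocation.
   result = self.conn.invoke_rest_api(**api_kwargs)
   ```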






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-11 Thread via GitHub


ramitkataria commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1951740741


##
providers/src/airflow/providers/amazon/aws/operators/mwaa.py:
##
@@ -0,0 +1,111 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains AWS MWAA operators."""
+
+from __future__ import annotations
+
+from collections.abc import Sequence
+from typing import TYPE_CHECKING
+
+from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook
+from airflow.providers.amazon.aws.operators.base_aws import AwsBaseOperator
+from airflow.providers.amazon.aws.utils.mixins import aws_template_fields
+
+if TYPE_CHECKING:
+from airflow.utils.context import Context
+
+
+class MwaaTriggerDagRunOperator(AwsBaseOperator[MwaaHook]):
+"""
+Trigger a Dag Run for a Dag in an Amazon MWAA environment.
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:MwaaTriggerDagRunOperator`
+
+:param env_name: The MWAA environment name (templated)
+:param trigger_dag_id: The ID of the DAG to be triggered (templated)
+:param trigger_run_id: The Run ID. The value of this field can be set only 
when creating the object. This
+together with trigger_dag_id are a unique key. (templated)
+:param logical_date: The logical date (previously called execution date). 
This is the time or interval
+covered by this DAG run, according to the DAG definition. The value of 
this field can be set only when

Review Comment:
   Yeah, thinking about the use case, I don't think this line would ever be
useful, so it's probably best to remove it.






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-11 Thread via GitHub


o-nikolas commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1951398509


##
providers/src/airflow/providers/amazon/aws/operators/mwaa.py:
##
@@ -0,0 +1,111 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains AWS MWAA operators."""
+
+from __future__ import annotations
+
+from collections.abc import Sequence
+from typing import TYPE_CHECKING
+
+from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook
+from airflow.providers.amazon.aws.operators.base_aws import AwsBaseOperator
+from airflow.providers.amazon.aws.utils.mixins import aws_template_fields
+
+if TYPE_CHECKING:
+from airflow.utils.context import Context
+
+
+class MwaaTriggerDagRunOperator(AwsBaseOperator[MwaaHook]):
+"""
+Trigger a Dag Run for a Dag in an Amazon MWAA environment.
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:MwaaTriggerDagRunOperator`
+
+:param env_name: The MWAA environment name (templated)
+:param trigger_dag_id: The ID of the DAG to be triggered (templated)
+:param trigger_run_id: The Run ID. The value of this field can be set only 
when creating the object. This
+together with trigger_dag_id are a unique key. (templated)
+:param logical_date: The logical date (previously called execution date). 
This is the time or interval
+covered by this DAG run, according to the DAG definition. The value of 
this field can be set only when

Review Comment:
   I don't think it's really relevant here, but someone with more experience
with the REST API could confirm.






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-10 Thread via GitHub


ramitkataria commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1950078891


##
providers/tests/system/amazon/aws/example_mwaa.py:
##
@@ -0,0 +1,87 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from datetime import datetime
+
+from airflow.models.baseoperator import chain
+from airflow.models.dag import DAG
+from airflow.providers.amazon.aws.operators.mwaa import 
MwaaTriggerDagRunOperator
+
+from providers.tests.system.amazon.aws.utils import SystemTestContextBuilder
+
+DAG_ID = "example_mwaa"
+
+# Externally fetched variables:
+EXISTING_ENVIRONMENT_NAME_KEY = "ENVIRONMENT_NAME"
+EXISTING_DAG_ID_KEY = "DAG_ID"
+
+
+sys_test_context_task = (
+SystemTestContextBuilder()
+# NOTE: Creating a functional MWAA environment is time-consuming and 
requires
+# manually creating and configuring an S3 bucket for DAG storage and a VPC 
with
+# private subnets which is out of scope for this demo. To simplify this 
demo and
+# make it run in a reasonable time, follow these steps in the AWS Console 
to create
+# a new MWAA environment with default configuration:
+# 1. Create an S3 bucket and upload your DAGs to a 'dags' directory
+# 2. Navigate to the MWAA console
+# 3. Create an environment, making sure to use the S3 bucket from the 
previous step
+# 4. Use the default VPC/network settings (or create a new one using this 
guide:
+# https://docs.aws.amazon.com/mwaa/latest/userguide/vpc-create.html)
+# 5. Select Public network for web server access and click through to 
creation

Review Comment:
   Thanks! I used the comment in `example_ecs.py` as inspiration, which is why I
included the steps. For the most part, using the console is straightforward; the
one gotcha is that if the default VPC doesn't have a private subnet, the console
doesn't make it obvious why it won't let you select it. So maybe I'll just
shorten this comment down to that information.






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-10 Thread via GitHub


ramitkataria commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1950074010


##
providers/src/airflow/providers/amazon/aws/operators/mwaa.py:
##
@@ -0,0 +1,111 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains AWS MWAA operators."""
+
+from __future__ import annotations
+
+from collections.abc import Sequence
+from typing import TYPE_CHECKING
+
+from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook
+from airflow.providers.amazon.aws.operators.base_aws import AwsBaseOperator
+from airflow.providers.amazon.aws.utils.mixins import aws_template_fields
+
+if TYPE_CHECKING:
+from airflow.utils.context import Context
+
+
+class MwaaTriggerDagRunOperator(AwsBaseOperator[MwaaHook]):
+"""
+Trigger a Dag Run for a Dag in an Amazon MWAA environment.
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:MwaaTriggerDagRunOperator`
+
+:param env_name: The MWAA environment name (templated)
+:param trigger_dag_id: The ID of the DAG to be triggered (templated)
+:param trigger_run_id: The Run ID. The value of this field can be set only 
when creating the object. This
+together with trigger_dag_id are a unique key. (templated)
+:param logical_date: The logical date (previously called execution date). 
This is the time or interval
+covered by this DAG run, according to the DAG definition. The value of 
this field can be set only when

Review Comment:
   I got it from
[here](https://airflow.apache.org/docs/apache-airflow/stable/stable-rest-api-ref.html#operation/post_dag_run).
The docs make it seem like important info, so I included it, but I can
remove it if you'd like.






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-10 Thread via GitHub


o-nikolas commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1949653704


##
providers/src/airflow/providers/amazon/aws/hooks/mwaa.py:
##
@@ -0,0 +1,81 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains AWS MWAA hook."""
+
+from __future__ import annotations
+
+from botocore.exceptions import ClientError
+
+from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook
+
+
+class MwaaHook(AwsBaseHook):
+"""
+Interact with AWS Managed Workflows for Apache Airflow.
+
+Provide thin wrapper around :external+boto3:py:class:`boto3.client("mwaa") 
`
+
+Additional arguments (such as ``aws_conn_id``) may be specified and
+are passed down to the underlying AwsBaseHook.
+
+.. seealso::
+- :class:`airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook`
+"""
+
+def __init__(self, *args, **kwargs) -> None:
+kwargs["client_type"] = "mwaa"
+super().__init__(*args, **kwargs)
+
+def invoke_rest_api(
+self,
+env_name: str,
+path: str,
+method: str,
+body: dict | None = None,
+query_params: dict | None = None,
+) -> dict:
+"""
+Invoke the REST API on the Airflow webserver with the specified inputs.
+
+.. seealso::
+- :external+boto3:py:meth:`MWAA.Client.invoke_rest_api`
+
+:param env_name: name of the MWAA environment
+:param path: Apache Airflow REST API endpoint path to be called
+:param method: HTTP method used for making Airflow REST API calls
+:param body: Request body for the Apache Airflow REST API call
+:param query_params: Query parameters to be included in the Apache 
Airflow REST API call
+"""
+body = body or {}
+api_kwargs = {
+"Name": env_name,
+"Path": path,
+"Method": method,
+# Filter out keys with None values because Airflow REST API 
doesn't accept requests otherwise
+"Body": {k: v for k, v in body.items() if v is not None},
+"QueryParameters": query_params if query_params else {},
+}
+try:
+result = self.get_conn().invoke_rest_api(**api_kwargs)
+result.pop("ResponseMetadata", None)

Review Comment:
   Can you add a comment describing why ResponseMetadata is being removed here, 
for posterity.
   
   Same in the exception handling block.
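   
   Something along these lines is what I have in mind (just a sketch of the kind
of comment; I'm assuming the except branch strips the metadata from the error
response in the same way):
   
   ```python
   try:
       result = self.get_conn().invoke_rest_api(**api_kwargs)
       # ResponseMetadata is botocore transport metadata (request id, HTTP
       # headers, retry info), not part of the Airflow REST API payload, so
       # drop it before handing the result back to the caller.
       result.pop("ResponseMetadata", None)
       return result
   except ClientError as e:
       # The error response carries the same transport metadata; strip it here
       # as well so the surfaced error only contains the API-level details.
       e.response.pop("ResponseMetadata", None)
       self.log.error(e.response)
       raise
   ```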



##
providers/src/airflow/providers/amazon/aws/operators/mwaa.py:
##
@@ -0,0 +1,111 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains AWS MWAA operators."""
+
+from __future__ import annotations
+
+from collections.abc import Sequence
+from typing import TYPE_CHECKING
+
+from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook
+from airflow.providers.amazon.aws.operators.base_aws import AwsBaseOperator
+from airflow.providers.amazon.aws.utils.mixins import aws_template_fields
+
+if TYPE_CHECKING:
+from airflow.utils.context import Context
+
+
+class MwaaTriggerDagRunOperator(AwsBaseOperator[MwaaHook]):
+"""
+Trigger a Dag Run for a Dag in an Amazon MWAA environment.
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:MwaaTriggerDagRunOperator`
+
+:param env_name: The MWAA environment name (templated)
+:param trigger_dag_id: The ID of the DAG

Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-10 Thread via GitHub


o-nikolas commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1949653353


##
providers/src/airflow/providers/amazon/aws/hooks/mwaa.py:
##
@@ -0,0 +1,81 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains AWS MWAA hook."""
+
+from __future__ import annotations
+
+from botocore.exceptions import ClientError
+
+from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook
+
+
+class MwaaHook(AwsBaseHook):
+"""
+Interact with AWS Managed Workflows for Apache Airflow.
+
+Provide thin wrapper around :external+boto3:py:class:`boto3.client("mwaa") 
`
+
+Additional arguments (such as ``aws_conn_id``) may be specified and
+are passed down to the underlying AwsBaseHook.
+
+.. seealso::
+- :class:`airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook`
+"""
+
+def __init__(self, *args, **kwargs) -> None:
+kwargs["client_type"] = "mwaa"
+super().__init__(*args, **kwargs)
+
+def invoke_rest_api(
+self,
+env_name: str,
+path: str,
+method: str,
+body: dict | None = None,
+query_params: dict | None = None,
+) -> dict:
+"""
+Invoke the REST API on the Airflow webserver with the specified inputs.
+
+.. seealso::
+- :external+boto3:py:meth:`MWAA.Client.invoke_rest_api`
+
+:param env_name: name of the MWAA environment
+:param path: Apache Airflow REST API endpoint path to be called
+:param method: HTTP method used for making Airflow REST API calls
+:param body: Request body for the Apache Airflow REST API call
+:param query_params: Query parameters to be included in the Apache 
Airflow REST API call
+"""
+body = body or {}
+api_kwargs = {
+"Name": env_name,
+"Path": path,
+"Method": method,
+# Filter out keys with None values because Airflow REST API 
doesn't accept requests otherwise
+"Body": {k: v for k, v in body.items() if v is not None},
+"QueryParameters": query_params if query_params else {},
+}
+try:
+result = self.get_conn().invoke_rest_api(**api_kwargs)
+result.pop("ResponseMetadata", None)

Review Comment:
   Can you add a comment describing why ResponseMetadata is being removed here, 
for posterity.



##
providers/src/airflow/providers/amazon/aws/hooks/mwaa.py:
##
@@ -0,0 +1,81 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains AWS MWAA hook."""
+
+from __future__ import annotations
+
+from botocore.exceptions import ClientError
+
+from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook
+
+
+class MwaaHook(AwsBaseHook):
+"""
+Interact with AWS Managed Workflows for Apache Airflow.
+
+Provide thin wrapper around :external+boto3:py:class:`boto3.client("mwaa") 
`
+
+Additional arguments (such as ``aws_conn_id``) may be specified and
+are passed down to the underlying AwsBaseHook.
+
+.. seealso::
+- :class:`airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook`
+"""
+
+def __init__(self, *args, **kwargs) -> None:
+kwargs["client_type"] = "mwaa"
+super().__init__(*args, **kwargs)
+
+def invoke_rest_api(
+self,
+env_name: str,
+path: str,
+method: str,
+body: dict | None = None,
+  

Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-08 Thread via GitHub


eladkal commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1948016786


##
providers/tests/system/amazon/aws/example_mwaa.py:
##
@@ -0,0 +1,88 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from datetime import datetime
+
+from airflow.models.baseoperator import chain
+from airflow.models.dag import DAG
+from airflow.providers.amazon.aws.operators.mwaa import 
MwaaTriggerDagRunOperator
+
+from providers.tests.system.amazon.aws.utils import ENV_ID_KEY, 
SystemTestContextBuilder
+
+DAG_ID = "example_mwaa"
+
+# Externally fetched variables:
+EXISTING_ENVIRONMENT_NAME_KEY = "ENVIRONMENT_NAME"
+EXISTING_DAG_ID_KEY = "DAG_ID"
+
+
+sys_test_context_task = (
+SystemTestContextBuilder()
+# NOTE: Creating a functional MWAA environment is time-consuming and 
requires
+# manually creating and configuring an S3 bucket for DAG storage and a VPC 
with
+# private subnets which is out of scope for this demo. To simplify this 
demo and
+# make it run in a reasonable time, follow these steps in the AWS Console 
to create
+# a new MWAA environment with default configuration:
+# 1. Create an S3 bucket and upload your DAGs to a 'dags' directory
+# 2. Navigate to the MWAA console
+# 3. Create an environment, making sure to use the S3 bucket from the 
previous step
+# 4. Use the default VPC/network settings (or create a new one using this 
guide:
+# https://docs.aws.amazon.com/mwaa/latest/userguide/vpc-create.html)
+# 5. Select Public network for web server access and click through to 
creation
+.add_variable(EXISTING_ENVIRONMENT_NAME_KEY)
+.add_variable(EXISTING_DAG_ID_KEY)
+.build()
+)
+
+with DAG(
+dag_id=DAG_ID,
+schedule="@once",
+start_date=datetime(2021, 1, 1),
+tags=["example"],
+catchup=False,
+) as dag:
+test_context = sys_test_context_task()
+env_id = test_context[ENV_ID_KEY]
+env_name = test_context[EXISTING_ENVIRONMENT_NAME_KEY]
+trigger_dag_id = test_context[EXISTING_DAG_ID_KEY]
+
+# [START howto_operator_mwaa_trigger_dag_run]
+trigger_dag_run = MwaaTriggerDagRunOperator(
+task_id="trigger_dag_run",
+env_name=env_name,
+trigger_dag_id=trigger_dag_id,
+)

Review Comment:
   Please update the docs to explain how this differs from the regular
TriggerDagRunOperator.
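   
   For example, something along these lines in the guide would make the
distinction clear (sketch; the environment name is a placeholder):
   
   ```python
   from airflow.operators.trigger_dagrun import TriggerDagRunOperator
   from airflow.providers.amazon.aws.operators.mwaa import MwaaTriggerDagRunOperator
   
   # Triggers a run of a DAG that lives in *this* Airflow deployment.
   trigger_local = TriggerDagRunOperator(
       task_id="trigger_local",
       trigger_dag_id="target_dag",
   )
   
   # Triggers a run of a DAG that lives in a *separate* Amazon MWAA environment,
   # by calling that environment's Airflow REST API through the AWS MWAA API.
   trigger_remote = MwaaTriggerDagRunOperator(
       task_id="trigger_remote",
       env_name="MyMwaaEnvironment",  # placeholder MWAA environment name
       trigger_dag_id="target_dag",
   )
   ```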






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-08 Thread via GitHub


ramitkataria commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1947997343


##
providers/tests/system/amazon/aws/example_mwaa.py:
##
@@ -0,0 +1,88 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from datetime import datetime
+
+from airflow.models.baseoperator import chain
+from airflow.models.dag import DAG
+from airflow.providers.amazon.aws.operators.mwaa import 
MwaaTriggerDagRunOperator
+
+from providers.tests.system.amazon.aws.utils import ENV_ID_KEY, 
SystemTestContextBuilder
+
+DAG_ID = "example_mwaa"
+
+# Externally fetched variables:
+EXISTING_ENVIRONMENT_NAME_KEY = "ENVIRONMENT_NAME"
+EXISTING_DAG_ID_KEY = "DAG_ID"
+
+
+sys_test_context_task = (
+SystemTestContextBuilder()
+# NOTE: Creating a functional MWAA environment is time-consuming and 
requires
+# manually creating and configuring an S3 bucket for DAG storage and a VPC 
with
+# private subnets which is out of scope for this demo. To simplify this 
demo and
+# make it run in a reasonable time, follow these steps in the AWS Console 
to create
+# a new MWAA environment with default configuration:
+# 1. Create an S3 bucket and upload your DAGs to a 'dags' directory
+# 2. Navigate to the MWAA console
+# 3. Create an environment, making sure to use the S3 bucket from the 
previous step
+# 4. Use the default VPC/network settings (or create a new one using this 
guide:
+# https://docs.aws.amazon.com/mwaa/latest/userguide/vpc-create.html)
+# 5. Select Public network for web server access and click through to 
creation
+.add_variable(EXISTING_ENVIRONMENT_NAME_KEY)
+.add_variable(EXISTING_DAG_ID_KEY)
+.build()
+)
+
+with DAG(
+dag_id=DAG_ID,
+schedule="@once",
+start_date=datetime(2021, 1, 1),
+tags=["example"],
+catchup=False,
+) as dag:
+test_context = sys_test_context_task()
+env_id = test_context[ENV_ID_KEY]
+env_name = test_context[EXISTING_ENVIRONMENT_NAME_KEY]
+trigger_dag_id = test_context[EXISTING_DAG_ID_KEY]
+
+# [START howto_operator_mwaa_trigger_dag_run]
+trigger_dag_run = MwaaTriggerDagRunOperator(
+task_id="trigger_dag_run",
+env_name=env_name,
+trigger_dag_id=trigger_dag_id,
+)

Review Comment:
   Yes exactly, assuming at least environment B is running on AWS MWAA. And as 
for the hook, it would let you perform any API action on environment B.
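   
   For example, from a task running in environment A (sketch; the environment
name and DAG id below are placeholders):
   
   ```python
   from airflow.providers.amazon.aws.hooks.mwaa import MwaaHook
   
   hook = MwaaHook(aws_conn_id="aws_default")
   
   # e.g. list the DAGs deployed in environment B ...
   dags_in_b = hook.invoke_rest_api(
       env_name="environment-b",  # placeholder MWAA environment name
       path="/dags",
       method="GET",
   )
   
   # ... or trigger a run of one of them, which is essentially what
   # MwaaTriggerDagRunOperator wraps.
   new_run = hook.invoke_rest_api(
       env_name="environment-b",
       path="/dags/my_dag/dagRuns",  # placeholder DAG id
       method="POST",
       body={"note": "triggered from environment A"},
   )
   ```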






Re: [PR] Add MwaaTriggerDagRunOperator and MwaaHook to Amazon Provider Package [airflow]

2025-02-07 Thread via GitHub


eladkal commented on code in PR #46579:
URL: https://github.com/apache/airflow/pull/46579#discussion_r1947497344


##
providers/tests/system/amazon/aws/example_mwaa.py:
##
@@ -0,0 +1,88 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from datetime import datetime
+
+from airflow.models.baseoperator import chain
+from airflow.models.dag import DAG
+from airflow.providers.amazon.aws.operators.mwaa import 
MwaaTriggerDagRunOperator
+
+from providers.tests.system.amazon.aws.utils import ENV_ID_KEY, 
SystemTestContextBuilder
+
+DAG_ID = "example_mwaa"
+
+# Externally fetched variables:
+EXISTING_ENVIRONMENT_NAME_KEY = "ENVIRONMENT_NAME"
+EXISTING_DAG_ID_KEY = "DAG_ID"
+
+
+sys_test_context_task = (
+SystemTestContextBuilder()
+# NOTE: Creating a functional MWAA environment is time-consuming and 
requires
+# manually creating and configuring an S3 bucket for DAG storage and a VPC 
with
+# private subnets which is out of scope for this demo. To simplify this 
demo and
+# make it run in a reasonable time, follow these steps in the AWS Console 
to create
+# a new MWAA environment with default configuration:
+# 1. Create an S3 bucket and upload your DAGs to a 'dags' directory
+# 2. Navigate to the MWAA console
+# 3. Create an environment, making sure to use the S3 bucket from the 
previous step
+# 4. Use the default VPC/network settings (or create a new one using this 
guide:
+# https://docs.aws.amazon.com/mwaa/latest/userguide/vpc-create.html)
+# 5. Select Public network for web server access and click through to 
creation
+.add_variable(EXISTING_ENVIRONMENT_NAME_KEY)
+.add_variable(EXISTING_DAG_ID_KEY)
+.build()
+)
+
+with DAG(
+dag_id=DAG_ID,
+schedule="@once",
+start_date=datetime(2021, 1, 1),
+tags=["example"],
+catchup=False,
+) as dag:
+test_context = sys_test_context_task()
+env_id = test_context[ENV_ID_KEY]
+env_name = test_context[EXISTING_ENVIRONMENT_NAME_KEY]
+trigger_dag_id = test_context[EXISTING_DAG_ID_KEY]
+
+# [START howto_operator_mwaa_trigger_dag_run]
+trigger_dag_run = MwaaTriggerDagRunOperator(
+task_id="trigger_dag_run",
+env_name=env_name,
+trigger_dag_id=trigger_dag_id,
+)

Review Comment:
   Is the use case here that you're running a DAG in environment A and want to
trigger another DAG in environment B?


