[GitHub] [airflow] ferruzzi commented on a diff in pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-04-12 Thread via GitHub


ferruzzi commented on code in PR #28338:
URL: https://github.com/apache/airflow/pull/28338#discussion_r1164562801


##
docs/apache-airflow-providers-amazon/operators/dynamodb.rst:
##
@@ -0,0 +1,55 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+===============
+Amazon DynamoDB
+===============
+
+`Amazon DynamoDB `__ is a
+fully managed, serverless, key-value NoSQL database designed to run
+high-performance applications at any scale. DynamoDB offers built-in security,
+continuous backups, automated multi-Region replication, in-memory caching, and
+data import and export tools.

Review Comment:
   Cool, works for me. :+1: 



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ferruzzi commented on a diff in pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-04-12 Thread via GitHub


ferruzzi commented on code in PR #28338:
URL: https://github.com/apache/airflow/pull/28338#discussion_r1164555934


##
docs/apache-airflow-providers-amazon/operators/dynamodb.rst:
##
@@ -0,0 +1,55 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+===============
+Amazon DynamoDB
+===============
+
+`Amazon DynamoDB `__ is a
+fully managed, serverless, key-value NoSQL database designed to run
+high-performance applications at any scale. DynamoDB offers built-in security,
+continuous backups, automated multi-Region replication, in-memory caching, and
+data import and export tools.

Review Comment:
   Just a question, where did this blurb come from?  I usually pull the blurb 
from the official [AWS docs page](https://docs.aws.amazon.com/dynamodb/) for 
the service so it has their wording, but that isn't always necessary.






[GitHub] [airflow] ferruzzi commented on a diff in pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-04-12 Thread via GitHub


ferruzzi commented on code in PR #28338:
URL: https://github.com/apache/airflow/pull/28338#discussion_r1164537652


##
dags/example_dynamodb.py:
##
@@ -0,0 +1,110 @@
+# Licensed to the Apache Software Foundation (ASF) under one

Review Comment:
   And you'll need to add the START and END tags for the code snippet(s) for the docs page, but I presume that is Coming Soon :tm:.
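For context, Airflow system tests mark the region to be pulled into the docs with `# [START tag]` / `# [END tag]` comment markers. A minimal pure-Python sketch of how such markers delimit an includable region (the tag name `howto_sensor_dynamodb_value` and the snippet contents are hypothetical, not from this PR):

```python
# Sketch of the comment-marker convention used for docs code snippets.
# The tag name and the sensor snippet below are illustrative only.
SOURCE = '''\
setup = create_table(table_name)

# [START howto_sensor_dynamodb_value]
dynamodb_sensor = DynamoDBValueSensor(
    task_id="waiting_for_dynamodb_item_value",
    table_name=table_name,
    partition_key_name="PK",
    partition_key_value="Test",
    attribute_name="Value",
    attribute_value="Testing",
)
# [END howto_sensor_dynamodb_value]

teardown = delete_table(table_name)
'''


def extract_snippet(source: str, tag: str) -> str:
    """Return only the lines between the START and END markers for *tag*."""
    lines = source.splitlines()
    start = lines.index(f"# [START {tag}]") + 1
    end = lines.index(f"# [END {tag}]")
    return "\n".join(lines[start:end])


snippet = extract_snippet(SOURCE, "howto_sensor_dynamodb_value")
print(snippet)  # only the sensor block, without the setup/teardown lines
```

The docs build then renders just that region, so the surrounding setup/teardown plumbing stays out of the published example.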






[GitHub] [airflow] ferruzzi commented on a diff in pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-04-12 Thread via GitHub


ferruzzi commented on code in PR #28338:
URL: https://github.com/apache/airflow/pull/28338#discussion_r1164536631


##
dags/example_dynamodb.py:
##
@@ -0,0 +1,110 @@
+# Licensed to the Apache Software Foundation (ASF) under one

Review Comment:
   Other than the one addition below, this file should live in 
`tests/system/providers/amazon/aws/example_dynamodb.py`






[GitHub] [airflow] ferruzzi commented on a diff in pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-04-12 Thread via GitHub


ferruzzi commented on code in PR #28338:
URL: https://github.com/apache/airflow/pull/28338#discussion_r1164534885


##
dags/example_dynamodb.py:
##
@@ -0,0 +1,110 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from datetime import datetime
+
+import boto3
+from airflow.decorators import task
+from airflow.models.baseoperator import chain
+from airflow import DAG
+
+from airflow.providers.amazon.aws.sensors.dynamodb import DynamoDBValueSensor
+from tests.system.providers.amazon.aws.utils import ENV_ID_KEY, SystemTestContextBuilder
+
+DAG_ID = "example_dynamodbvaluesensor"
+sys_test_context_task = SystemTestContextBuilder().build()
+
+TABLE_ATTRIBUTES = [
+    {"AttributeName": "PK", "AttributeType": "S"},
+    {"AttributeName": "SK", "AttributeType": "S"},
+]
+TABLE_KEY_SCHEMA = [
+    {"AttributeName": "PK", "KeyType": "HASH"},
+    {"AttributeName": "SK", "KeyType": "RANGE"},
+]
+TABLE_THROUGHPUT = {"ReadCapacityUnits": 10, "WriteCapacityUnits": 10}
+
+
+@task
+def create_table(table_name: str):
+    ddb = boto3.resource("dynamodb")
+    table = ddb.create_table(
+        AttributeDefinitions=TABLE_ATTRIBUTES,
+        TableName=table_name,
+        KeySchema=TABLE_KEY_SCHEMA,
+        ProvisionedThroughput=TABLE_THROUGHPUT,
+    )
+    boto3.client("dynamodb").get_waiter("table_exists").wait()
+    table.put_item(Item={"PK": "Test", "SK": "2022-07-12T11:11:25-0400", "Value": "Testing"})
+
+
+@task

Review Comment:
   ```suggestion
   @task(trigger_rule=TriggerRule.ALL_DONE)
   ```
   
   This way, even if the test fails it will clean up the table.






[GitHub] [airflow] ferruzzi commented on a diff in pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-04-04 Thread via GitHub


ferruzzi commented on code in PR #28338:
URL: https://github.com/apache/airflow/pull/28338#discussion_r1157996409


##
tests/providers/amazon/aws/sensors/test_dynamodb.py:
##
@@ -0,0 +1,64 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from __future__ import annotations
+
+from unittest.mock import Mock, patch, PropertyMock
+
+from airflow.providers.amazon.aws.sensors.dynamodb import DynamoDBValueSensor
+from airflow.providers.amazon.aws.hooks.dynamodb import DynamoDBHook
+from moto import mock_dynamodb
+
+
+class TestDynamoDBValueSensor:
+    def setup_method(self):
+        self.table_name = "test_airflow"
+        self.key_name = "PK"
+        self.key_value = "Test"
+        self.attribute_name = "Foo"
+        self.attribute_value = "Bar"
+
+        self.sensor = DynamoDBValueSensor(
+            task_id="dynamodb_value_sensor",
+            table_name=self.table_name,
+            partition_key_name=self.key_name,
+            partition_key_value=self.key_value,
+            attribute_name=self.attribute_name,
+            attribute_value=self.attribute_value,
+        )
+
+    @mock_dynamodb
+    def test_sensor_with_pk(self):
+        hook = DynamoDBHook(table_name=self.table_name, table_keys=[self.key_name])
+
+        hook.conn.create_table(
+            TableName=self.table_name,
+            KeySchema=[{"AttributeName": self.key_name, "KeyType": "HASH"}],
+            AttributeDefinitions=[{"AttributeName": self.key_name, "AttributeType": "S"}],
+            ProvisionedThroughput={"ReadCapacityUnits": 10, "WriteCapacityUnits": 10},
+        )
+
+        items = [{self.key_name: self.key_value, self.attribute_name: self.attribute_value}]
+        hook.write_batch_data(items)
+
+        assert self.sensor.poke({})

Review Comment:
   You can use `poke(None)` here.  My IDE gets grumpy with that and likes 
`poke({})` but AFAICT they are functionally identical.
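   The equivalence is easy to see with a plain-Python stand-in (this `FakeValueSensor` is an illustration, not the real Airflow `BaseSensorOperator`): `poke` accepts its context argument purely for interface compatibility and never reads it, so any value passed behaves the same.

   ```python
   # Minimal stand-in for a sensor whose poke() ignores its context argument.
   # Illustration only; not the real Airflow sensor base class.
   class FakeValueSensor:
       def __init__(self, expected: str, actual: str):
           self.expected = expected
           self.actual = actual

       def poke(self, context) -> bool:
           # context is accepted for interface compatibility but unused,
           # so poke({}) and poke(None) are interchangeable here.
           return self.expected == self.actual


   sensor = FakeValueSensor(expected="Bar", actual="Bar")
   print(sensor.poke({}), sensor.poke(None))  # → True True
   ```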






[GitHub] [airflow] ferruzzi commented on a diff in pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-04-04 Thread via GitHub


ferruzzi commented on code in PR #28338:
URL: https://github.com/apache/airflow/pull/28338#discussion_r1157996021


##
tests/providers/amazon/aws/sensors/test_dynamodb.py:
##
@@ -0,0 +1,64 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from __future__ import annotations
+
+from unittest.mock import Mock, patch, PropertyMock
+
+from airflow.providers.amazon.aws.sensors.dynamodb import DynamoDBValueSensor
+from airflow.providers.amazon.aws.hooks.dynamodb import DynamoDBHook
+from moto import mock_dynamodb
+
+
+class TestDynamoDBValueSensor:
+    def setup_method(self):
+        self.table_name = "test_airflow"
+        self.key_name = "PK"
+        self.key_value = "Test"
+        self.attribute_name = "Foo"
+        self.attribute_value = "Bar"
+
+        self.sensor = DynamoDBValueSensor(
+            task_id="dynamodb_value_sensor",
+            table_name=self.table_name,
+            partition_key_name=self.key_name,
+            partition_key_value=self.key_value,
+            attribute_name=self.attribute_name,
+            attribute_value=self.attribute_value,
+        )
+
+    @mock_dynamodb
+    def test_sensor_with_pk(self):
+        hook = DynamoDBHook(table_name=self.table_name, table_keys=[self.key_name])
+
+        hook.conn.create_table(
+            TableName=self.table_name,
+            KeySchema=[{"AttributeName": self.key_name, "KeyType": "HASH"}],
+            AttributeDefinitions=[{"AttributeName": self.key_name, "AttributeType": "S"}],
+            ProvisionedThroughput={"ReadCapacityUnits": 10, "WriteCapacityUnits": 10},
+        )
+
+        items = [{self.key_name: self.key_value, self.attribute_name: self.attribute_value}]
+        hook.write_batch_data(items)
+
+        assert self.sensor.poke({})
+
+        mock_conn.Table = Mock()
+        mock_conn.Table.get_item = Mock(return_value=response)
+
+        assert self.sensor.poke(None)

Review Comment:
   These lines shouldn't be needed anymore, unless I'm missing something.






[GitHub] [airflow] ferruzzi commented on a diff in pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-04-04 Thread via GitHub


ferruzzi commented on code in PR #28338:
URL: https://github.com/apache/airflow/pull/28338#discussion_r1157995247


##
tests/providers/amazon/aws/sensors/test_dynamodb.py:
##
@@ -0,0 +1,64 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from __future__ import annotations
+
+from unittest.mock import Mock, patch, PropertyMock
+
+from airflow.providers.amazon.aws.sensors.dynamodb import DynamoDBValueSensor
+from airflow.providers.amazon.aws.hooks.dynamodb import DynamoDBHook
+from moto import mock_dynamodb
+
+
+class TestDynamoDBValueSensor:
+    def setup_method(self):
+        self.table_name = "test_airflow"
+        self.key_name = "PK"
+        self.key_value = "Test"
+        self.attribute_name = "Foo"
+        self.attribute_value = "Bar"
+
+        self.sensor = DynamoDBValueSensor(
+            task_id="dynamodb_value_sensor",
+            table_name=self.table_name,
+            partition_key_name=self.key_name,
+            partition_key_value=self.key_value,
+            attribute_name=self.attribute_name,
+            attribute_value=self.attribute_value,
+        )
+
+    @mock_dynamodb
+    def test_sensor_with_pk(self):

Review Comment:
   Your original test had a PK and an SK, which was a good idea.  Maybe 
copy/paste this and make a second test that has both?
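   A toy model of why the second test is worth having: with a composite primary key the sort key is part of the item's identity, so a lookup by partition key alone misses the item. (This sketch keys plain dicts by tuples; a real test would use boto3 with moto as in the diff above.)

   ```python
   # Toy model of DynamoDB key semantics: a partition-key-only table is keyed
   # by a 1-tuple, a partition+sort-key table by a 2-tuple.  Illustration only.
   pk_only_table = {("Test",): {"Foo": "Bar"}}
   pk_sk_table = {("Test", "2022-07-12T11:11:25-0400"): {"Foo": "Bar"}}


   def get_item(table, partition_key, sort_key=None):
       """Look up an item by its full primary key."""
       key = (partition_key,) if sort_key is None else (partition_key, sort_key)
       return table.get(key)


   print(get_item(pk_only_table, "Test"))  # item found with PK alone
   print(get_item(pk_sk_table, "Test"))  # None: the sort key is required
   print(get_item(pk_sk_table, "Test", "2022-07-12T11:11:25-0400"))  # item found
   ```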









[GitHub] [airflow] ferruzzi commented on a diff in pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-04-04 Thread via GitHub


ferruzzi commented on code in PR #28338:
URL: https://github.com/apache/airflow/pull/28338#discussion_r1157994954


##
tests/providers/amazon/aws/sensors/test_dynamodb.py:
##
@@ -0,0 +1,129 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from unittest import mock
+from unittest.mock import MagicMock
+from pprint import pprint
+
+import boto3
+import pytest
+from moto import mock_dynamodb
+
+# from airflow.exceptions import AirflowException
+from airflow.providers.amazon.aws.hooks.dynamodb import DynamoDBHook
+from airflow.providers.amazon.aws.sensors.dynamodb import DynamoDBValueSensor
+
+AWS_CONN_ID = "aws_default"
+REGION_NAME = "us-east-1"
+TABLE_NAME = "test_airflow"
+TASK_ID = "dynamodb_value_sensor"
+
+# breeze testing tests ./tests/providers/amazon/aws/sensors/test_dynamodb.py --log-cli-level=INFO
+
+
+@pytest.fixture
+def use_moto():
+    @mock_dynamodb
+    def dynamodb_client():
+        dynamodb = boto3.resource("dynamodb", region_name=REGION_NAME)
+
+        dynamodb.create_table(
+            TableName=TABLE_NAME,
+            KeySchema=[
+                {"AttributeName": "PK", "KeyType": "HASH"},
+                {"AttributeName": "SK", "KeyType": "RANGE"},
+            ],
+            AttributeDefinitions=[
+                {"AttributeName": "PK", "AttributeType": "S"},
+                {"AttributeName": "SK", "AttributeType": "S"},
+            ],
+            BillingMode="PAY_PER_REQUEST",
+        )
+        return dynamodb
+
+    return dynamodb_client
+
+
+class TestDynamoDBValueSensor:
+    def setup_method(self):
+        self.mock_context = MagicMock()
+
+    def test_init(self):
+        sensor = DynamoDBValueSensor(
+            task_id=TASK_ID,
+            table_name=TABLE_NAME,
+            partition_key_name="PK",
+            partition_key_value="Test",
+            sort_key_name="SK",
+            sort_key_value="2022-07-12T11:11:25-0400",
+            attribute_name="Foo",
+            attribute_value="Bar",
+            aws_conn_id=AWS_CONN_ID,
+            region_name=REGION_NAME,
+        )
+
+        assert TASK_ID == sensor.task_id
+        assert AWS_CONN_ID == sensor.aws_conn_id
+        assert REGION_NAME == sensor.region_name
+
+    @mock_dynamodb
+    def test_get_conn_returns_a_boto3_connection(self):
+        hook = DynamoDBHook(aws_conn_id=AWS_CONN_ID)
+        assert hook.get_conn() is not None
+
+    @mock_dynamodb
+    @mock.patch("airflow.providers.amazon.aws.sensors.dynamodb.DynamoDBHook")
+    def test_sensor_with_pk_and_sk(self, ddb_mock):
+        pprint(ddb_mock)
+
+        hook = DynamoDBHook(
+            aws_conn_id=AWS_CONN_ID, table_name=TABLE_NAME, table_keys=["PK"], region_name=REGION_NAME
+        )
+
+        hook.get_conn().create_table(
+            TableName=TABLE_NAME,
+            KeySchema=[
+                {"AttributeName": "PK", "KeyType": "HASH"},
+            ],
+            AttributeDefinitions=[{"AttributeName": "PK", "AttributeType": "S"}],
+            ProvisionedThroughput={"ReadCapacityUnits": 10, "WriteCapacityUnits": 10},
+        )
+
+        table = hook.get_conn().Table(TABLE_NAME)
+        table.meta.client.get_waiter("table_exists").wait(TableName=TABLE_NAME)
+
+        assert table.table_status == "ACTIVE"
+
+        sensor = DynamoDBValueSensor(
+            task_id=TASK_ID,
+            poke_interval=30,
+            timeout=120,
+            soft_fail=False,
+            retries=10,
+            table_name=TABLE_NAME,  # replace with your table name
+            partition_key_name="PK",  # replace with your partition key name
+            partition_key_value="Test",  # replace with your partition key value
+            sort_key_name="SK",  # replace with your sort key name (if applicable)
+            sort_key_value="2023-03-28T11:11:25-0400",  # replace with your sort key value (if applicable)
+            attribute_name="Foo",  # replace with the attribute name to wait for
+            attribute_value="Bar",  # replace with the attribute value to wait for (sensor will return true when this value matches the attribute value in the item)
+        )
+

Review Comment:
   You can resolve this thread so it's 

[GitHub] [airflow] ferruzzi commented on a diff in pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-04-04 Thread via GitHub


ferruzzi commented on code in PR #28338:
URL: https://github.com/apache/airflow/pull/28338#discussion_r1157750214


##
airflow/providers/amazon/aws/sensors/dynamodb.py:
##
@@ -0,0 +1,91 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from typing import TYPE_CHECKING, Any, Optional
+
+from airflow.compat.functools import cached_property
+from airflow.providers.amazon.aws.hooks.dynamodb import DynamoDBHook
+from airflow.sensors.base import BaseSensorOperator
+
+if TYPE_CHECKING:
+    from airflow.utils.context import Context
+
+
+class DynamoDBValueSensor(BaseSensorOperator):
+    """
+    Waits for an attribute value to be present for an item in a DynamoDB table.
+
+    :param partition_key_name: DynamoDB partition key name
+    :param partition_key_value: DynamoDB partition key value
+    :param attribute_name: DynamoDB attribute name
+    :param attribute_value: DynamoDB attribute value
+    :param sort_key_name: (optional) DynamoDB sort key name
+    :param sort_key_value: (optional) DynamoDB sort key value
+    """
+
+    def __init__(
+        self,
+        table_name: str,
+        partition_key_name: str,
+        partition_key_value: str,
+        attribute_name: str,
+        attribute_value: str,
+        sort_key_name: Optional[str] = None,
+        sort_key_value: Optional[str] = None,
+        aws_conn_id: str | None = DynamoDBHook.default_conn_name,
+        region_name: str | None = None,
+        **kwargs: Any,
+    ):
+        super().__init__(**kwargs)
+        self.table_name = table_name
+        self.partition_key_name = partition_key_name
+        self.partition_key_value = partition_key_value
+        self.attribute_name = attribute_name
+        self.attribute_value = attribute_value
+        self.sort_key_name = sort_key_name
+        self.sort_key_value = sort_key_value
+        self.aws_conn_id = aws_conn_id
+        self.region_name = region_name
+
+    def poke(self, context: Context) -> bool:
+        """Test DynamoDB item for matching attribute value"""
+        key = {self.partition_key_name: self.partition_key_value}
+        msg = (
+            f"Checking table {self.table_name} for"

Review Comment:
   Not sure if this is for your own debugging or if it is going to stay, but 
add a space at the end of this line or you end up with `... foritem ...` in the 
output
   ```suggestion
   f"Checking table {self.table_name} for "
   ```
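   The bug is plain Python behavior: adjacent string literals inside parentheses concatenate with no separator, so the missing trailing space fuses the fragments. A quick reproduction (the second fragment, "item with attribute ...", is illustrative, not the exact line from the PR):

   ```python
   # Adjacent string literals concatenate with no separator, so a missing
   # trailing space fuses the fragments into "...foritem...".
   table_name = "test_airflow"
   attribute_name = "Foo"

   without_space = (
       f"Checking table {table_name} for"       # no trailing space...
       f"item with attribute {attribute_name}"  # ...so the words fuse
   )
   with_space = (
       f"Checking table {table_name} for "      # trailing space fixes it
       f"item with attribute {attribute_name}"
   )

   print(without_space)  # → Checking table test_airflow foritem with attribute Foo
   print(with_space)     # → Checking table test_airflow for item with attribute Foo
   ```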






[GitHub] [airflow] ferruzzi commented on a diff in pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-03-29 Thread via GitHub


ferruzzi commented on code in PR #28338:
URL: https://github.com/apache/airflow/pull/28338#discussion_r1152368101


##
tests/providers/amazon/aws/sensors/test_dynamodb.py:
##
@@ -0,0 +1,129 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from unittest import mock
+from unittest.mock import MagicMock
+from pprint import pprint
+
+import boto3
+import pytest
+from moto import mock_dynamodb
+
+# from airflow.exceptions import AirflowException
+from airflow.providers.amazon.aws.hooks.dynamodb import DynamoDBHook
+from airflow.providers.amazon.aws.sensors.dynamodb import DynamoDBValueSensor
+
+AWS_CONN_ID = "aws_default"
+REGION_NAME = "us-east-1"
+TABLE_NAME = "test_airflow"
+TASK_ID = "dynamodb_value_sensor"
+
+# breeze testing tests ./tests/providers/amazon/aws/sensors/test_dynamodb.py --log-cli-level=INFO
+
+
+@pytest.fixture
+def use_moto():
+    @mock_dynamodb
+    def dynamodb_client():
+        dynamodb = boto3.resource("dynamodb", region_name=REGION_NAME)
+
+        dynamodb.create_table(
+            TableName=TABLE_NAME,
+            KeySchema=[
+                {"AttributeName": "PK", "KeyType": "HASH"},
+                {"AttributeName": "SK", "KeyType": "RANGE"},
+            ],
+            AttributeDefinitions=[
+                {"AttributeName": "PK", "AttributeType": "S"},
+                {"AttributeName": "SK", "AttributeType": "S"},
+            ],
+            BillingMode="PAY_PER_REQUEST",
+        )
+        return dynamodb
+
+    return dynamodb_client
+
+
+class TestDynamoDBValueSensor:
+    def setup_method(self):
+        self.mock_context = MagicMock()
+
+    def test_init(self):
+        sensor = DynamoDBValueSensor(
+            task_id=TASK_ID,
+            table_name=TABLE_NAME,
+            partition_key_name="PK",
+            partition_key_value="Test",
+            sort_key_name="SK",
+            sort_key_value="2022-07-12T11:11:25-0400",
+            attribute_name="Foo",
+            attribute_value="Bar",
+            aws_conn_id=AWS_CONN_ID,
+            region_name=REGION_NAME,
+        )
+
+        assert TASK_ID == sensor.task_id
+        assert AWS_CONN_ID == sensor.aws_conn_id
+        assert REGION_NAME == sensor.region_name
+
+    @mock_dynamodb
+    def test_get_conn_returns_a_boto3_connection(self):
+        hook = DynamoDBHook(aws_conn_id=AWS_CONN_ID)
+        assert hook.get_conn() is not None
+
+    @mock_dynamodb
+    @mock.patch("airflow.providers.amazon.aws.sensors.dynamodb.DynamoDBHook")
+    def test_sensor_with_pk_and_sk(self, ddb_mock):
+        pprint(ddb_mock)
+
+        hook = DynamoDBHook(
+            aws_conn_id=AWS_CONN_ID, table_name=TABLE_NAME, table_keys=["PK"], region_name=REGION_NAME
+        )
+
+        hook.get_conn().create_table(
+            TableName=TABLE_NAME,
+            KeySchema=[
+                {"AttributeName": "PK", "KeyType": "HASH"},
+            ],
+            AttributeDefinitions=[{"AttributeName": "PK", "AttributeType": "S"}],
+            ProvisionedThroughput={"ReadCapacityUnits": 10, "WriteCapacityUnits": 10},
+        )
+
+        table = hook.get_conn().Table(TABLE_NAME)
+        table.meta.client.get_waiter("table_exists").wait(TableName=TABLE_NAME)
+
+        assert table.table_status == "ACTIVE"
+
+        sensor = DynamoDBValueSensor(
+            task_id=TASK_ID,
+            poke_interval=30,
+            timeout=120,
+            soft_fail=False,
+            retries=10,
+            table_name=TABLE_NAME,  # replace with your table name
+            partition_key_name="PK",  # replace with your partition key name
+            partition_key_value="Test",  # replace with your partition key value
+            sort_key_name="SK",  # replace with your sort key name (if applicable)
+            sort_key_value="2023-03-28T11:11:25-0400",  # replace with your sort key value (if applicable)
+            attribute_name="Foo",  # replace with the attribute name to wait for
+            attribute_value="Bar",  # replace with the attribute value to wait for (sensor will return true when this value matches the attribute value in the item)
+        )
+

Review Comment:
   Looking at other sensor tests (EC2 

[GitHub] [airflow] ferruzzi commented on a diff in pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-03-29 Thread via GitHub


ferruzzi commented on code in PR #28338:
URL: https://github.com/apache/airflow/pull/28338#discussion_r1152363711


##
tests/providers/amazon/aws/sensors/test_dynamodb.py:
##
@@ -0,0 +1,129 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from unittest import mock
+from unittest.mock import MagicMock
+from pprint import pprint
+
+import boto3
+import pytest
+from moto import mock_dynamodb
+
+# from airflow.exceptions import AirflowException
+from airflow.providers.amazon.aws.hooks.dynamodb import DynamoDBHook
+from airflow.providers.amazon.aws.sensors.dynamodb import DynamoDBValueSensor
+
+AWS_CONN_ID = "aws_default"
+REGION_NAME = "us-east-1"
+TABLE_NAME = "test_airflow"
+TASK_ID = "dynamodb_value_sensor"
+
+# breeze testing tests ./tests/providers/amazon/aws/sensors/test_dynamodb.py --log-cli-level=INFO
+
+

Review Comment:
   Hm.  Alright.   I usually use the client rather than the resource, so I 
wasn't positive if they were interchangeable in this case or not.  Looks like 
that wasn't the issue.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ferruzzi commented on a diff in pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-03-29 Thread via GitHub


ferruzzi commented on code in PR #28338:
URL: https://github.com/apache/airflow/pull/28338#discussion_r1152231454


##
tests/providers/amazon/aws/sensors/test_dynamodb.py:
##
@@ -0,0 +1,129 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from unittest import mock
+from unittest.mock import MagicMock
+from pprint import pprint
+
+import boto3
+import pytest
+from moto import mock_dynamodb
+
+# from airflow.exceptions import AirflowException
+from airflow.providers.amazon.aws.hooks.dynamodb import DynamoDBHook
+from airflow.providers.amazon.aws.sensors.dynamodb import DynamoDBValueSensor
+
+AWS_CONN_ID = "aws_default"
+REGION_NAME = "us-east-1"
+TABLE_NAME = "test_airflow"
+TASK_ID = "dynamodb_value_sensor"
+
+# breeze testing tests ./tests/providers/amazon/aws/sensors/test_dynamodb.py --log-cli-level=INFO
+
+
+@pytest.fixture
+def use_moto():
+@mock_dynamodb
+def dynamodb_client():
+dynamodb = boto3.resource("dynamodb", region_name=REGION_NAME)
+
+dynamodb.create_table(
+TableName=TABLE_NAME,
+KeySchema=[
+{"AttributeName": "PK", "KeyType": "HASH"},
+{"AttributeName": "SK", "KeyType": "RANGE"},
+],
+AttributeDefinitions=[
+{"AttributeName": "PK", "AttributeType": "S"},
+{"AttributeName": "SK", "AttributeType": "S"},
+],
+BillingMode="PAY_PER_REQUEST",
+)
+return dynamodb
+
+return dynamodb_client
+
+
+class TestDynamoDBValueSensor:
+def setup_method(self):
+self.mock_context = MagicMock()
+
+def test_init(self):
+sensor = DynamoDBValueSensor(
+task_id=TASK_ID,
+table_name=TABLE_NAME,
+partition_key_name="PK",
+partition_key_value="Test",
+sort_key_name="SK",
+sort_key_value="2022-07-12T11:11:25-0400",
+attribute_name="Foo",
+attribute_value="Bar",
+aws_conn_id=AWS_CONN_ID,
+region_name=REGION_NAME,
+)
+
+assert TASK_ID == sensor.task_id
+assert AWS_CONN_ID == sensor.aws_conn_id
+assert REGION_NAME == sensor.region_name
+
+@mock_dynamodb
+def test_get_conn_returns_a_boto3_connection(self):
+hook = DynamoDBHook(aws_conn_id=AWS_CONN_ID)
+assert hook.get_conn() is not None

Review Comment:
   Here and below, please use `hook.conn` instead of `hook.get_conn`.  They are 
functionally identical but we are trying to work towards standardizing the code 
when possible.
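
   As a side note, one reason `hook.conn` is usually preferred is that it is typically implemented as a cached property, so the underlying boto3 connection is built at most once. The sketch below illustrates only that caching behaviour with a stand-in class; it is an assumption about the pattern, not the actual `DynamoDBHook` implementation:

```python
from functools import cached_property


class FakeHook:
    """Illustrative stand-in, not the real DynamoDBHook."""

    def __init__(self):
        self._calls = 0  # how many times the underlying client was built

    def get_conn(self):
        # In the real hook this would create a boto3 resource/client.
        self._calls += 1
        return object()

    @cached_property
    def conn(self):
        # Cached: repeated access reuses the first connection.
        return self.get_conn()


hook = FakeHook()
assert hook.conn is hook.conn  # same object every time
assert hook._calls == 1        # get_conn() ran only once
```

   Whether the provider hook caches in exactly this way is an assumption; the review point itself is only that `hook.conn` is the standardized spelling.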



##
tests/providers/amazon/aws/sensors/test_dynamodb.py:
##
@@ -0,0 +1,129 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from unittest import mock
+from unittest.mock import MagicMock
+from pprint import pprint
+
+import boto3
+import pytest
+from moto import mock_dynamodb
+
+# from airflow.exceptions import AirflowException
+from airflow.providers.amazon.aws.hooks.dynamodb import DynamoDBHook
+from airflow.providers.amazon.aws.sensors.dynamodb import DynamoDBValueSensor
+
+AWS_CONN_ID = "aws_default"
+REGION_NAME = "us-east-1"
+TABLE_NAME = "test_airflow"
+TASK_ID = "dynamodb_value_sensor"
+
+# breeze testing tests ./tests/providers/amazon/aws/sensors/test_dynamodb.py --log-cli-level=INFO
+
+

Review Comment:
   I don't see where this fixture is being used anywhere?
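
   For context on the fixture question: a pytest fixture only executes when a test requests it by naming it as a parameter, so a fixture that nothing requests is dead code. A minimal illustration of that mechanism (names here are hypothetical, not the PR's actual tests):

```python
import pytest


@pytest.fixture
def table_description():
    # Stand-in for what a moto-backed fixture might return.
    return {"TableName": "test_airflow", "TableStatus": "ACTIVE"}


def test_table_is_active(table_description):
    # pytest injects the fixture's return value via the parameter name;
    # without this parameter, the fixture body never runs.
    assert table_description["TableStatus"] == "ACTIVE"
```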



##

[GitHub] [airflow] ferruzzi commented on a diff in pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-01-30 Thread via GitHub


ferruzzi commented on code in PR #28338:
URL: https://github.com/apache/airflow/pull/28338#discussion_r1090962387


##
airflow/providers/amazon/aws/sensors/dynamodb.py:
##
@@ -0,0 +1,83 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from typing import TYPE_CHECKING, Any, Optional
+
+from airflow.compat.functools import cached_property
+from airflow.providers.amazon.aws.hooks.dynamodb import DynamoDBHook
+from airflow.sensors.base import BaseSensorOperator
+
+if TYPE_CHECKING:
+from airflow.utils.context import Context
+
+
+class DynamoDBValueSensor(BaseSensorOperator):
+"""
+Waits for an attribute value to be present for an item in a DynamoDB table.
+
+:param partition_key_name: DynamoDB partition key name
+:param partition_key_value: DynamoDB partition key value
+:param attribute_name: DynamoDB attribute name
+:param attribute_value: DynamoDB attribute value
+:param sort_key_name: (optional) DynamoDB sort key name
+:param sort_key_value: (optional) DynamoDB sort key value
+"""
+
+def __init__(
+self,
+table_name: str,
+partition_key_name: str,
+partition_key_value: str,
+attribute_name: str,
+attribute_value: str,
+sort_key_name: Optional[str] = None,
+sort_key_value: Optional[str] = None,
+aws_conn_id: str | None = DynamoDBHook.default_conn_name,
+**kwargs: Any,
+):
+super().__init__(**kwargs)
+self.table_name = table_name
+self.partition_key_name = partition_key_name
+self.partition_key_value = partition_key_value
+self.attribute_name = attribute_name
+self.attribute_value = attribute_value
+self.sort_key_name = sort_key_name
+self.sort_key_value = sort_key_value
+self.aws_conn_id = aws_conn_id
+
+def poke(self, context: Context) -> bool:
+"""Test DynamoDB item for matching attribute value"""
+key = {self.partition_key_name: self.partition_key_value}
+msg = f"Checking table {self.table_name} for item Partition Key: {self.partition_key_name}={self.partition_key_value}"
+
+if self.sort_key_value:
+key[self.sort_key_name] = self.sort_key_value
+msg += f" Sort Key: {self.sort_key_name}={self.sort_key_value}"
+
+msg += f" attribute: {self.attribute_name}={self.attribute_value}"
+

Review Comment:
   Tiny nitpick idea, this is going to result in a pretty long string, maybe consider formatting it a little with \n newlines?
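
   One possible shape for that message using \n as suggested. This is a sketch with hard-coded stand-in values mirroring the sensor's attributes, not the sensor's actual code:

```python
# Stand-in values mirroring the sensor's attributes.
table_name = "test_airflow"
partition_key_name, partition_key_value = "PK", "Test"
sort_key_name, sort_key_value = "SK", "2023-03-28T11:11:25-0400"
attribute_name, attribute_value = "Foo", "Bar"

# One field per line keeps long key/attribute values readable in task logs.
msg = (
    f"Checking table {table_name} for item\n"
    f"  Partition Key: {partition_key_name}={partition_key_value}\n"
    f"  Sort Key: {sort_key_name}={sort_key_value}\n"
    f"  Attribute: {attribute_name}={attribute_value}"
)
print(msg)
```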



##
airflow/providers/amazon/aws/sensors/dynamodb.py:
##
@@ -0,0 +1,83 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from typing import TYPE_CHECKING, Any, Optional
+
+from airflow.compat.functools import cached_property
+from airflow.providers.amazon.aws.hooks.dynamodb import DynamoDBHook
+from airflow.sensors.base import BaseSensorOperator
+
+if TYPE_CHECKING:
+from airflow.utils.context import Context
+
+
+class DynamoDBValueSensor(BaseSensorOperator):
+"""
+Waits for an attribute value to be present for an item in a DynamoDB table.
+
+:param partition_key_name: DynamoDB partition key name
+:param partition_key_value: DynamoDB partition key value
+:param attribute_name: DynamoDB attribute name
+:param attribute_value: DynamoDB attribute value
+:param sort_key_name: (optional) DynamoDB sort key 

[GitHub] [airflow] ferruzzi commented on a diff in pull request #28338: New AWS sensor — DynamoDBValueSensor

2022-12-13 Thread GitBox


ferruzzi commented on code in PR #28338:
URL: https://github.com/apache/airflow/pull/28338#discussion_r1047829104


##
airflow/providers/amazon/aws/example_dags/example_dynamodb_sensor.py:
##
@@ -0,0 +1,44 @@
+# Licensed to the Apache Software Foundation (ASF) under one

Review Comment:
   I actually noticed that after I made the comment and took a peek where we 
could squeeze it into the existing test and... we don't have one.  Sorry, I got 
distracted and forgot to come back and sort that out.
   
   I'd say it can be added into the existing [Dynamo-to-S3 system test](https://github.com/apache/airflow/blob/main/tests/system/providers/amazon/aws/example_dynamodb_to_s3.py).  Drop the new sensor right after the create_table task perhaps? 



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ferruzzi commented on a diff in pull request #28338: New AWS sensor — DynamoDBValueSensor

2022-12-13 Thread GitBox


ferruzzi commented on code in PR #28338:
URL: https://github.com/apache/airflow/pull/28338#discussion_r1047829104


##
airflow/providers/amazon/aws/example_dags/example_dynamodb_sensor.py:
##
@@ -0,0 +1,44 @@
+# Licensed to the Apache Software Foundation (ASF) under one

Review Comment:
   I actually noticed that after I made the comment and took a peek where we 
could squeeze it into the existing test and... we don't have one.  Sorry, I got 
distracted and forgot to come back and sort that out.
   
   I'd say it can be added into the existing [Dynamo-to-S3 system test](https://github.com/apache/airflow/blob/main/tests/system/providers/amazon/aws/example_dynamodb_to_s3.py).  Drop the new sensor right after the table_setup perhaps? 



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ferruzzi commented on a diff in pull request #28338: New AWS sensor — DynamoDBValueSensor

2022-12-13 Thread GitBox


ferruzzi commented on code in PR #28338:
URL: https://github.com/apache/airflow/pull/28338#discussion_r1047588453


##
airflow/providers/amazon/aws/example_dags/example_dynamodb_sensor.py:
##
@@ -0,0 +1,44 @@
+# Licensed to the Apache Software Foundation (ASF) under one

Review Comment:
   Example DAGs are being dropped and have all been moved into system tests [here](https://github.com/apache/airflow/tree/main/tests/system/providers/amazon/aws).  Please either update the DynamoDB system test with this new sensor if possible or create a new system test there instead of this Example DAG.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org