[GitHub] [airflow] o-nikolas commented on pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-04-14 Thread via GitHub


o-nikolas commented on PR #28338:
URL: https://github.com/apache/airflow/pull/28338#issuecomment-1508900573

   congrats on the first contribution @mrichman!!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] o-nikolas commented on pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-04-13 Thread via GitHub


o-nikolas commented on PR #28338:
URL: https://github.com/apache/airflow/pull/28338#issuecomment-1507883671

   Just approved another build on the latest changes! :crossed_fingers: 





[GitHub] [airflow] o-nikolas commented on pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-03-30 Thread via GitHub


o-nikolas commented on PR #28338:
URL: https://github.com/apache/airflow/pull/28338#issuecomment-1490954958

   > Is there a way to patch the call to `get_item()` in my sensor so that it returns a canned response?
   
   Absolutely, you could even just mock the conn to return a mocked `Table` object and then set the return value you want on `table.get_item()` to whatever you like.
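
   For illustration, here is a minimal sketch of that mocked-conn approach. The test name and canned item are hypothetical, and it assumes the sensor reaches DynamoDB through `DynamoDBHook().conn.Table(...).get_item(...)` and compares the item's attribute against `attribute_value`; the merged sensor's internals may differ slightly.

   ```python
   # Hypothetical sketch: patch DynamoDBHook so the sensor never talks to AWS,
   # and make table.get_item() return a canned response.
   from unittest import mock

   from airflow.providers.amazon.aws.sensors.dynamodb import DynamoDBValueSensor


   @mock.patch("airflow.providers.amazon.aws.sensors.dynamodb.DynamoDBHook")
   def test_poke_with_canned_response(ddb_hook_mock):
       # Any Table(...) lookup on the mocked hook's conn returns this mock table,
       # whose get_item() hands back the canned item below.
       mock_table = mock.MagicMock()
       mock_table.get_item.return_value = {"Item": {"PK": "Test", "Foo": "Bar"}}
       ddb_hook_mock.return_value.conn.Table.return_value = mock_table

       sensor = DynamoDBValueSensor(
           task_id="dynamodb_value_sensor",
           table_name="test_airflow",
           partition_key_name="PK",
           partition_key_value="Test",
           attribute_name="Foo",
           attribute_value="Bar",
       )

       # With the attribute already set to the expected value, poke() should
       # succeed without moto or any real DynamoDB table.
       assert sensor.poke(None)
   ```

   Compared with the moto-backed test quoted later in the thread, this keeps the assertion purely about the sensor's comparison logic rather than about table setup.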





[GitHub] [airflow] o-nikolas commented on pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-03-30 Thread via GitHub


o-nikolas commented on PR #28338:
URL: https://github.com/apache/airflow/pull/28338#issuecomment-1490667768

   > > > It's failing on the last line of the test `assert sensor.poke(None)`. What am I doing wrong here?
   > > 
   > > Can you include some more context (traceback, logs, etc) for the failure you're seeing?
   > 
   > Here's the output of `breeze testing tests ./tests/providers/amazon/aws/sensors/test_dynamodb.py`
   > 
   > ```
   > === test session starts ===
   > platform linux -- Python 3.10.10, pytest-7.2.2, pluggy-1.0.0
   > rootdir: /opt/airflow, configfile: pytest.ini
   > plugins: asyncio-0.21.0, httpx-0.21.3, time-machine-2.9.0, instafail-0.4.2, timeouts-1.2.1, requests-mock-1.10.0, anyio-3.6.2, xdist-3.2.1, cov-4.0.0, rerunfailures-11.1.2, capture-warnings-0.0.4
   > asyncio: mode=strict
   > setup timeout: 60.0s, execution timeout: 60.0s, teardown timeout: 60.0s
   > collected 3 items
   > 
   > tests/providers/amazon/aws/sensors/test_dynamodb.py ..F   [100%]
   > 
   > === FAILURES ===
   > __ TestDynamoDBValueSensor.test_sensor_with_pk_and_sk __
   > 
   > self = 
   > ddb_mock = 
   > 
   >     @mock_dynamodb
   >     @mock.patch("airflow.providers.amazon.aws.sensors.dynamodb.DynamoDBHook")
   >     def test_sensor_with_pk_and_sk(self, ddb_mock):
   >         hook = DynamoDBHook(
   >             aws_conn_id=AWS_CONN_ID, table_name=TABLE_NAME, table_keys=["PK"], region_name=REGION_NAME
   >         )
   > 
   >         hook.conn.create_table(
   >             TableName=TABLE_NAME,
   >             KeySchema=[
   >                 {"AttributeName": "PK", "KeyType": "HASH"},
   >                 {"AttributeName": "SK", "KeyType": "RANGE"},
   >             ],
   >             AttributeDefinitions=[
   >                 {"AttributeName": "PK", "AttributeType": "S"},
   >                 {"AttributeName": "SK", "AttributeType": "S"},
   >             ],
   >             BillingMode="PAY_PER_REQUEST",
   >         )
   > 
   >         table = hook.conn.Table(TABLE_NAME)
   >         table.meta.client.get_waiter("table_exists").wait(TableName=TABLE_NAME)
   > 
   >         assert table.table_status == "ACTIVE"
   > 
   >         sensor = DynamoDBValueSensor(
   >             task_id=TASK_ID,
   >             poke_interval=30,
   >             timeout=120,
   >             soft_fail=False,
   >             retries=10,
   >             table_name=TABLE_NAME,  # replace with your table name
   >             partition_key_name="PK",  # replace with your partition key name
   >             partition_key_value="Test",  # replace with your partition key value
   >             sort_key_name="SK",  # replace with your sort key name (if applicable)
   >             sort_key_value="2023-03-28T11:11:25-0400",  # replace with your sort key value (if applicable)
   >             attribute_name="Foo",  # replace with the attribute name to wait for
   >             attribute_value="Bar",  # replace with the attribute value to wait for (sensor will return true when this value matches the attribute value in the item)
   >         )
   > 
   >         assert not sensor.poke(None)
   > 
   >         table.put_item(Item={"PK": "123", "SK": "2023-03-28T11:11:25-0400", "Foo": "Bar"})
   > 
   > >       assert sensor.poke(None)
   > E       assert False
   > E        +  where False = >(None)
   > E        +    where > = .poke
   > 
   > tests/providers/amazon/aws/sensors/test_dynamodb.py:104: AssertionError
   > - Captured stderr call -
   > INFO  [airflow.hooks.base] Using connection ID 'aws_default' for task execution.
   > INFO  [botocore.credentials] Found credentials in environment variables.
   > INFO  [airflow.task.operators] Checking table test_airflow for item
   > Partition Key: PK=Test
   > Sort Key: SK=2023-03-28T11:11:25-0400
   > attribute: Foo=Bar
   > INFO  [airflow.task.operators] Response: 
   > INFO  [airflow.task.operators] Checking table test_airflow for item
   > Partition Key: PK=Test
   > Sort Key: SK=2023-03-28T11:11:25-0400
   > attribute: Foo=Bar
   > INFO  [airflow.task.operators] Response: 
   > -- Captured log call ---
   > INFO airflow.hooks.base:base.py:73 Using connection ID 'aws_default' for task execution.
   > INFO botocore.credentials:credentials.py:1124 Found credentials in environment variables.
   > INFO airflow.task.operators:dynamodb.py:79 Checking table test_airflow for item
   > Partition Key: PK=Test
   > Sort Key: SK=2023-03-28T11:11:25-0400
   > attribute: Foo=Bar
   > INFO airflow.task.operators:dynamodb.py:82 Response: 
   > INFO airflow.task.operators:dynamodb.py:79 Checking table test_airflow for item
   > Partition Key: PK=Test
   > Sort Key: SK=2023-03-2

[GitHub] [airflow] o-nikolas commented on pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-03-29 Thread via GitHub


o-nikolas commented on PR #28338:
URL: https://github.com/apache/airflow/pull/28338#issuecomment-1489353144

   > It's failing on the last line of the test `assert sensor.poke(None)`. What am I doing wrong here?
   
   Can you include some more context (traceback, logs, etc) for the failure you're seeing?
   





[GitHub] [airflow] o-nikolas commented on pull request #28338: New AWS sensor — DynamoDBValueSensor

2023-01-24 Thread via GitHub


o-nikolas commented on PR #28338:
URL: https://github.com/apache/airflow/pull/28338#issuecomment-1403128044

   Hey @mrichman,
   
   Any plans to make more progress on this one?

