Hi flink users,

We have a problem and think Flink may be a good solution for it. But I'm
new to Flink and hope to get some insights from the community :)

Here is the problem. Suppose we have a DynamoDB table that stores
inventory data, with a schema like:

* vendorId (primary key)
* inventory name
* inventory units
* inbound time
...

This DDB table keeps changing, since inventory is constantly arriving and
being removed. Every change triggers a DynamoDB stream event.
We need to calculate, for each vendor, the sum of all inventory units that
have been in stock for more than 15 days, like this:
> select vendorId, sum(inventory_units)
> from dynamodb
> where current_date - inbound_time > 15 days
> group by vendorId
We don't want to schedule a daily batch job, so we are trying to work on a
micro-batch solution in Flink, and publish this data to another DynamoDB
table.

A draft idea is to compute the total units minus the units newer than 15
days, since both quantities can be maintained from stream events. But we
have no detailed solution yet.
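To make the draft idea concrete, here is a minimal, in-memory sketch in plain Python of the "total minus recent" bookkeeping. All names (`AgedInventoryTracker`, `on_inbound`, `on_removal`, FIFO removal order) are assumptions for illustration only; in a real Flink job this state would live in keyed state per vendorId, with timers firing when a lot crosses the 15-day boundary instead of the on-demand scan shown here.

```python
from datetime import datetime, timedelta
from collections import defaultdict

AGE_THRESHOLD = timedelta(days=15)  # assumed threshold from the query above

class AgedInventoryTracker:
    """Toy model of the 'total units minus <15-day units' idea.

    Hypothetical sketch: a real Flink job would keep this as keyed
    state per vendorId, updated from DynamoDB stream events.
    """

    def __init__(self):
        # vendorId -> list of [inbound_time, units] lots still in stock
        self.lots = defaultdict(list)

    def on_inbound(self, vendor_id, units, inbound_time):
        # One stream event per inventory arrival.
        self.lots[vendor_id].append([inbound_time, units])

    def on_removal(self, vendor_id, units):
        # One stream event per removal; drain oldest lots first
        # (FIFO is an assumption about the removal semantics).
        remaining = units
        lots = self.lots[vendor_id]
        while remaining > 0 and lots:
            take = min(lots[0][1], remaining)
            lots[0][1] -= take
            remaining -= take
            if lots[0][1] == 0:
                lots.pop(0)

    def aged_units(self, vendor_id, now):
        # total minus units younger than the threshold
        total = sum(u for _, u in self.lots[vendor_id])
        recent = sum(u for t, u in self.lots[vendor_id]
                     if now - t <= AGE_THRESHOLD)
        return total - recent
```

For example, with a 100-unit lot inbound 20 days ago, a 50-unit lot inbound 5 days ago, and a removal of 30 units (taken from the oldest lot), the aged count would be 70. The result per vendor is what would be published to the downstream DynamoDB table.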

Could anyone help provide some insights here?

Thanks,
J.
