3
votes

I am using AWS DynamoDB to store information. I have two machines running separate code that accesses the information in the database.

One machine writes to the database and the other reads from it. Since the reading machine has no way of knowing whether the information in the database has changed, I need to monitor the database for changes somehow. I know there is something called DynamoDB Streams that can provide information about changes made to your database, and I already have that code implemented.

The question is as follows: if I am monitoring the database constantly, I need to query this stream all the time, say once every minute. What is the difference between doing that and simply querying the database every minute? Is it much more efficient? Is it less costly (in resources and money)? Is there any other, more efficient way of monitoring changes to a specific table from my code?
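For context, my stream-polling code is roughly along these lines (a minimal sketch using boto3; the table name is a placeholder):

```python
import boto3

streams = boto3.client('dynamodbstreams')
dynamodb = boto3.client('dynamodb')

# Look up the table's latest stream ARN (table name is a placeholder)
table = dynamodb.describe_table(TableName='my-table')['Table']
stream_arn = table['LatestStreamArn']

shards = streams.describe_stream(StreamArn=stream_arn)['StreamDescription']['Shards']
for shard in shards:
    iterator = streams.get_shard_iterator(
        StreamArn=stream_arn,
        ShardId=shard['ShardId'],
        ShardIteratorType='LATEST',
    )['ShardIterator']
    records = streams.get_records(ShardIterator=iterator)['Records']
    # ... process records, then sleep ~60 seconds and repeat
```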

Any help would be appreciated, thank you.


1 Answer

1
votes

Most people I have seen doing something like this do it with DynamoDB Streams + Lambda, and get the best results that way. Definitely check out the DynamoDB docs and the Lambda docs on this topic.
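As a rough sketch of what that looks like (the processing logic here is just a placeholder), a Lambda function subscribed to the table's stream receives batches of change records and reacts to them directly, instead of your code polling anything:

```python
# Sketch of a Lambda handler attached to a DynamoDB stream.
# Each invocation receives a batch of change records for the table.
def lambda_handler(event, context):
    for record in event['Records']:
        event_name = record['eventName']      # INSERT, MODIFY, or REMOVE
        keys = record['dynamodb']['Keys']     # the changed item's key attributes
        if event_name in ('INSERT', 'MODIFY'):
            new_image = record['dynamodb'].get('NewImage', {})
            # react to the new or updated item here
        elif event_name == 'REMOVE':
            # react to the deleted item here
            pass
    return {'processed': len(event['Records'])}
```

With this setup the Lambda service polls the stream on your behalf and invokes your function only when new records appear, so your reading machine never has to query the table on a schedule.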

There's also an example in the docs of monitoring DynamoDB where changes fire off a message to an SNS topic.
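A minimal sketch of that pattern (the topic ARN is a placeholder you would replace with your own):

```python
import json
import boto3

sns = boto3.client('sns')
TOPIC_ARN = 'arn:aws:sns:us-east-1:123456789012:table-changes'  # placeholder ARN

def lambda_handler(event, context):
    # Forward each change record from the DynamoDB stream to an SNS topic
    for record in event['Records']:
        sns.publish(
            TopicArn=TOPIC_ARN,
            Subject='DynamoDB %s' % record['eventName'],
            Message=json.dumps(record['dynamodb'], default=str),
        )
```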

DynamoDB Streams is more efficient and near real time. Think of using Lambda this way like you would a trigger in a relational database. Why go to the extra effort when the pattern is so well defined and so widely used?