My data model is the following: Timestamp, value
This is currently stored in a CSV file in S3. The client downloads it and uses it to append the value column onto its own data, which has the model: Timestamp, name.
The final file has the model: Timestamp, name, value.
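For context, the client-side merge today is roughly the following (a minimal sketch in Python; the file names and header names are placeholders, not my actual code):

```python
import csv

# values.csv (downloaded from S3): Timestamp, value
# names.csv (on the client):       Timestamp, name
# output.csv (result):             Timestamp, name, value

with open("values.csv", newline="") as f:
    value_by_ts = {row["Timestamp"]: row["value"] for row in csv.DictReader(f)}

with open("names.csv", newline="") as f_in, open("output.csv", "w", newline="") as f_out:
    writer = csv.DictWriter(f_out, fieldnames=["Timestamp", "name", "value"])
    writer.writeheader()
    for row in csv.DictReader(f_in):
        writer.writerow({
            "Timestamp": row["Timestamp"],
            "name": row["name"],
            "value": value_by_ts.get(row["Timestamp"], ""),  # blank if no matching timestamp
        })
```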
Would it be faster to store all of the data in a DynamoDB table and look up each record by timestamp?
My concern is that we would be looking up 100-20k records per run, and I am not sure how well DynamoDB would handle that. If the whole file is on the client side (it is about 3MB), then the join can be done locally.
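At that scale the lookups would have to be batched: BatchGetItem takes at most 100 keys per call, so 20k records means at least 200 round trips. A rough sketch of what I think that would look like (boto3; the table name, the Timestamp key attribute, and storing value as a string are assumptions):

```python
import boto3

dynamodb = boto3.client("dynamodb")
TABLE = "my-table"  # assumed table name, partition key: Timestamp (string)

def lookup_values(timestamps):
    """Fetch the value for each timestamp, 100 keys per BatchGetItem call."""
    timestamps = list(dict.fromkeys(timestamps))  # dedupe: BatchGetItem rejects duplicate keys
    results = {}
    for i in range(0, len(timestamps), 100):
        request = {TABLE: {"Keys": [{"Timestamp": {"S": t}} for t in timestamps[i:i + 100]]}}
        while request:
            resp = dynamodb.batch_get_item(RequestItems=request)
            for item in resp["Responses"].get(TABLE, []):
                results[item["Timestamp"]["S"]] = item["value"]["S"]  # value assumed stored as a string
            request = resp.get("UnprocessedKeys") or None  # retry anything DynamoDB deferred
    return results
```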
The file is 3MB now, but it will grow over time via a scheduled Lambda function. I do not particularly need strongly consistent reads; eventually consistent is fine.
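If the data moved to DynamoDB, that scheduled Lambda could write items instead of rewriting the CSV. A minimal sketch, again with assumed names and hardcoded rows standing in for the real upstream source (DynamoDB reads default to eventually consistent, which matches what I need):

```python
import boto3

table = boto3.resource("dynamodb").Table("my-table")  # assumed table, partition key: Timestamp

def handler(event, context):
    """Scheduled Lambda: write new (Timestamp, value) records into DynamoDB."""
    # hardcoded for illustration; the real rows would come from the upstream source
    new_rows = [("2024-01-01T00:00:00Z", "42.0")]
    with table.batch_writer() as batch:  # groups PutItem calls, 25 per BatchWriteItem request
        for ts, value in new_rows:
            batch.put_item(Item={"Timestamp": ts, "value": value})
```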
Also, if the data were in DynamoDB, I could append the value column in Lambda instead of doing it on the client side. However, if DynamoDB is too slow, the lookups could cause the Lambda to time out.
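That Lambda would look something like this (a sketch only; bucket, key, and table names are assumptions, and the UnprocessedKeys retry from the lookup sketch above is omitted for brevity):

```python
import csv
import io
import boto3

s3 = boto3.client("s3")
dynamodb = boto3.client("dynamodb")
BUCKET = "my-bucket"  # assumed bucket
TABLE = "my-table"    # assumed table, partition key: Timestamp (string)

def handler(event, context):
    """Build Timestamp,name,value in Lambda instead of on the client."""
    body = s3.get_object(Bucket=BUCKET, Key="names.csv")["Body"].read().decode("utf-8")
    rows = list(csv.DictReader(io.StringIO(body)))
    timestamps = list(dict.fromkeys(r["Timestamp"] for r in rows))  # dedupe keys

    values = {}
    for i in range(0, len(timestamps), 100):  # 100-key BatchGetItem pages
        keys = [{"Timestamp": {"S": t}} for t in timestamps[i:i + 100]]
        resp = dynamodb.batch_get_item(RequestItems={TABLE: {"Keys": keys}})
        for item in resp["Responses"].get(TABLE, []):
            values[item["Timestamp"]["S"]] = item["value"]["S"]
        # NOTE: UnprocessedKeys should be retried as in the lookup sketch above

    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["Timestamp", "name", "value"])
    writer.writeheader()
    for r in rows:
        writer.writerow({"Timestamp": r["Timestamp"], "name": r["name"],
                         "value": values.get(r["Timestamp"], "")})
    s3.put_object(Bucket=BUCKET, Key="output.csv", Body=out.getvalue().encode("utf-8"))
```

My rough expectation is that 20k keys is only about 200 BatchGetItem calls, so it should stay well under the 15-minute Lambda limit, but that is exactly the part I am unsure about.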