I'm using the batchWriteItem API for DynamoDB.
I wrote a wrapper on top of the AWS SDK for PHP 2, and it works without a problem.
My code splits batchWriteItem requests into batches of 25 items and retries anything returned in the UnprocessedItems key of the response.
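For reference, the wrapper logic looks roughly like this (a simplified sketch; the table name, item attributes, and the $rows variable are placeholders, not my actual code):

```php
<?php
use Aws\DynamoDb\DynamoDbClient;

// Placeholder client setup; real region/credentials come from my config.
$client = DynamoDbClient::factory(array('region' => 'us-east-1'));

// $rows stands in for rows read from the source database.
$rows = array(array('id' => 1, 'name' => 'example'));

// Build PutRequest entries in DynamoDB's attribute-value format.
$putRequests = array();
foreach ($rows as $row) {
    $putRequests[] = array('PutRequest' => array('Item' => array(
        'id'   => array('S' => (string) $row['id']),
        'name' => array('S' => $row['name']),
    )));
}

// batchWriteItem accepts at most 25 put/delete requests per call,
// so split the full item list into chunks of 25.
foreach (array_chunk($putRequests, 25) as $chunk) {
    $result = $client->batchWriteItem(array(
        'RequestItems' => array('MyTable' => $chunk),
    ));
    $unprocessed = $result->get('UnprocessedItems');
    // ...retry handling for $unprocessed goes here (see below).
}
```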
Right now I am using this to migrate a large database into DynamoDB, but something weird is happening...
In short, batchWriteItem is only processing one item at a time and returning the rest in UnprocessedItems.
In detail, this is what is happening:
- I send a request to put 25 items via batchWriteItem.
- All 25 items come back in UnprocessedItems.
- My retry code runs and resends the 25 items (the retry loop is sketched after this list).
- 24 items come back in UnprocessedItems.
- My retry code runs and resends the 24 items.
- 23 items come back in UnprocessedItems.
- This repeats until 0 items are returned.
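My retry logic is essentially the following loop (simplified, continuing from the sketch above, where $client and $chunk are already defined): it immediately resends whatever comes back in UnprocessedItems until the map is empty, with no delay between attempts:

```php
<?php
// Simplified retry loop: resend UnprocessedItems until none remain.
// Note: there is currently no delay between attempts.
$requestItems = array('MyTable' => $chunk);
do {
    $result = $client->batchWriteItem(array('RequestItems' => $requestItems));
    $requestItems = $result->get('UnprocessedItems');
} while (!empty($requestItems));
```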
My Write Capacity Units are set to 8, and the ConsumedWriteCapacityUnits metric in CloudWatch shows the table running at somewhere between 1 and 1.5.
Does anyone know why this is happening?
I have confirmed that the items are actually being put into the DB, but there is really no point in using batchWriteItem if all it does is process items one by one...
====== UPDATE ======
I was able to find out that the Throttled Requests metric for batchWriteItem is skyrocketing.
Does this mean that the only solution is to increase the Write Capacity Units so that the table can handle these requests?
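For what it's worth, the DynamoDB documentation recommends retrying UnprocessedItems with exponential backoff rather than resending immediately, so before raising capacity I may try something like the sketch below (the delay constants are my own arbitrary choice, and $client and $chunk are as in the earlier sketch):

```php
<?php
// Sketch: retry UnprocessedItems with exponential backoff between
// attempts, as the DynamoDB documentation recommends.
$requestItems = array('MyTable' => $chunk);
$attempt = 0;
while (!empty($requestItems)) {
    if ($attempt > 0) {
        // Back off 100ms, 200ms, 400ms, ... capped at 5s.
        usleep(min(100000 * (1 << ($attempt - 1)), 5000000));
    }
    $result = $client->batchWriteItem(array('RequestItems' => $requestItems));
    $requestItems = $result->get('UnprocessedItems');
    $attempt++;
}
```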