2 votes

I'm using the batchWriteItem API for DynamoDB.

I wrote a wrapper on top of the AWS SDK for PHP 2, and it works without a problem.

I have code that splits batchWriteItem requests into chunks of 25 items, and that retries any items returned in the UnprocessedItems key of the response. Right now I am using this to migrate a large database into DynamoDB, but something weird is happening...
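Here is a rough sketch of the wrapper (simplified for this post; the credentials/region, the batchPut helper name, and the assumption that the items are already in DynamoDB's PutRequest wire format are placeholders, not my real code):

    <?php
    require 'vendor/autoload.php';

    use Aws\DynamoDb\DynamoDbClient;

    // Placeholder credentials/region, not my real config
    $client = DynamoDbClient::factory(array(
        'key'    => 'YOUR_KEY',
        'secret' => 'YOUR_SECRET',
        'region' => 'us-east-1',
    ));

    // Each entry in $putRequests is assumed to already look like
    // array('PutRequest' => array('Item' => array(...)))
    function batchPut(DynamoDbClient $client, $tableName, array $putRequests)
    {
        // batchWriteItem accepts at most 25 write requests per call
        foreach (array_chunk($putRequests, 25) as $chunk) {
            $requestItems = array($tableName => $chunk);
            do {
                $result = $client->batchWriteItem(array(
                    'RequestItems' => $requestItems,
                ));
                // Anything the service could not process comes back
                // here and gets resent on the next pass
                $requestItems = $result->get('UnprocessedItems');
            } while (!empty($requestItems));
        }
    }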

In short, batchWriteItem is only processing one item at a time and returning the rest in UnprocessedItems.

In detail, this is what is happening:

  1. I send a request to put 25 items via batchWriteItem.
  2. All 25 items come back in UnprocessedItems.
  3. My retry code runs, and it resends the 25 items.
  4. 24 items come back in UnprocessedItems.
  5. My retry code runs, and it resends the 24 items.
  6. 23 items come back in UnprocessedItems.
  7. This repeats until 0 items are returned.

I set my Write Capacity Units to 8, and the ConsumedWriteCapacityUnits metric in CloudWatch shows consumption currently running somewhere between 1 and 1.5.

Does anyone know why this is happening?

I have confirmed that the items are actually being put into the DB, but still, there is really no point to batchWriteItem if all it does is process items one by one...

====== UPDATE ======

I was able to find out that the ThrottledRequests metric for batchWriteItem is skyrocketing.

Does this mean that the only solution is to increase the Write Capacity Units so the table can handle the throttled requests?


1 Answer

1 vote

Yes, it sounds like you need to increase your Write Capacity Units. For details about how provisioned throughput works, you can ask on the Amazon DynamoDB forum and consult the Amazon DynamoDB Developer Guide.
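If you would rather raise the capacity through the API than through the console, something like this should work (a sketch; the credentials, table name, and capacity numbers are placeholders):

    <?php
    require 'vendor/autoload.php';

    use Aws\DynamoDb\DynamoDbClient;

    // Placeholder credentials/region
    $client = DynamoDbClient::factory(array(
        'key'    => 'YOUR_KEY',
        'secret' => 'YOUR_SECRET',
        'region' => 'us-east-1',
    ));

    // Placeholder table name and capacity values
    $client->updateTable(array(
        'TableName' => 'my-table',
        'ProvisionedThroughput' => array(
            'ReadCapacityUnits'  => 8,   // unchanged
            'WriteCapacityUnits' => 32,  // raised from 8
        ),
    ));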

Also, there is already a wrapper around batchWriteItem built into the SDK: the WriteRequestBatch class.
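Usage looks roughly like this (a sketch; the credentials, table name, and item data are placeholders):

    <?php
    require 'vendor/autoload.php';

    use Aws\DynamoDb\DynamoDbClient;
    use Aws\DynamoDb\Model\Item;
    use Aws\DynamoDb\Model\BatchRequest\PutRequest;
    use Aws\DynamoDb\Model\BatchRequest\WriteRequestBatch;

    // Placeholder credentials/region
    $client = DynamoDbClient::factory(array(
        'key'    => 'YOUR_KEY',
        'secret' => 'YOUR_SECRET',
        'region' => 'us-east-1',
    ));

    // Placeholder data
    $rows = array(
        array('id' => '1', 'name' => 'foo'),
        array('id' => '2', 'name' => 'bar'),
    );

    $batch = WriteRequestBatch::factory($client);

    // Queue puts; the class takes care of the 25-item limit for you
    foreach ($rows as $row) {
        $batch->add(new PutRequest(Item::fromArray($row), 'my-table'));
    }

    // Send everything that is still queued
    $batch->flush();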