2
votes

I am using the DynamoDB service for data insertion, but it is behaving erratically: sometimes it inserts the values, and most of the time it skips them, even though I send a different primary key every time. I am using the following code. Please advise. Thank you.

Dictionary<string, AttributeValue> attributes = new Dictionary<string, AttributeValue>();
foreach (KeyValuePair<string, object> entry in paramDictionary)
{
    // Substitute a placeholder when the value is null or empty.
    if (entry.Value == null || entry.Value.ToString() == "")
    {
        attributes[entry.Key] = new AttributeValue { S = "Empty Value" };
    }
    else
    {
        attributes[entry.Key] = new AttributeValue { S = entry.Value.ToString() };
    }
}

AmazonDynamoDBClient client = new AmazonDynamoDBClient();
PutItemRequest request = new PutItemRequest
{
    TableName = "tableNamehere",
    Item = attributes
};
client.PutItem(request);

Please help. Thanks in advance.

Kind Regards.

Are you getting any throughput-exceeded exceptions? (TheWhiteRabbit)
No. I have wrapped the call in a try/catch but get no exception in any case. The code seems to work fine, yet it only inserts sometimes and most of the time it doesn't, without leaving any error trace. (user2678516)
Can you clarify what exactly is being skipped? You may also be seeing eventual consistency issues. If you are using the GetItem operation, try setting GetItemRequest.ConsistentRead to true. (Pavel Safronov)
Thanks, Pavel. Let me clarify: whole rows are being skipped. I call this function on every page of the website to track user activity and some other attributes. It runs on each page navigation, but data is not inserted every time it is called; rows show up only randomly, say for page 1, then five minutes later for page 2, but not for every page. I am new to DynamoDB, so I am not sure what to do. (user2678516)
I thought I had this problem, but found that I just needed to click on the top right of the table to see the next page of items. I didn't realize that sorting in DynamoDB only sorts the items on the page you're viewing, not the whole table. (seeiespi)

2 Answers

2
votes

We fought with this problem for the last 48 hours until we finally re-read the description of the PutItem operation.

We had created a time-based key and had 6 instances inserting 3-4 records per second. Of 1,200 records inserted, only 600-700 made it into DynamoDB and CloudSearch.

What we realised, and maybe it is also affecting you, is that the PutItem operation overwrites items with the same key without returning an exception. In our case it therefore looked as though DynamoDB was dropping records on insert, when in reality we were generating duplicate keys and the records were overwriting each other.
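If you want DynamoDB to reject a write instead of silently overwriting an existing item, you can attach a condition expression to the PutItemRequest. The sketch below is only an illustration of that technique, not the original poster's code: it assumes a table whose partition key attribute is named "Id" (substitute your real key name) and uses the same synchronous PutItem call as the question.

using System;
using System.Collections.Generic;
using Amazon.DynamoDBv2;
using Amazon.DynamoDBv2.Model;

class ConditionalPutExample
{
    static void Main()
    {
        var client = new AmazonDynamoDBClient();

        var request = new PutItemRequest
        {
            TableName = "tableNamehere",
            Item = new Dictionary<string, AttributeValue>
            {
                // A GUID cannot collide the way a time-based key can.
                { "Id",   new AttributeValue { S = Guid.NewGuid().ToString() } },
                { "Page", new AttributeValue { S = "page1" } }
            },
            // Reject the write if an item with this key already exists,
            // instead of silently overwriting it.
            ConditionExpression = "attribute_not_exists(Id)"
        };

        try
        {
            client.PutItem(request);
            Console.WriteLine("Item written.");
        }
        catch (ConditionalCheckFailedException)
        {
            // This would have been a silent overwrite: a duplicate key was generated.
            Console.WriteLine("An item with this key already exists.");
        }
    }
}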

I hope this helps.

0
votes

What you're describing shouldn't happen. If you look at the table very quickly after the data is inserted (less than a second), you might not see it, because DynamoDB allows eventually consistent reads. If you're not seeing the data after minutes (or ever), then either your PUTs are not successful or DynamoDB is having problems.
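One way to rule out eventual consistency when you verify an insert is to read the item back with a strongly consistent read, as Pavel suggested in the comments. A minimal sketch, again assuming a hypothetical partition key attribute named "Id":

using System;
using System.Collections.Generic;
using Amazon.DynamoDBv2;
using Amazon.DynamoDBv2.Model;

class ConsistentReadCheck
{
    static void Main()
    {
        var client = new AmazonDynamoDBClient();

        var request = new GetItemRequest
        {
            TableName = "tableNamehere",
            Key = new Dictionary<string, AttributeValue>
            {
                { "Id", new AttributeValue { S = "key-you-just-wrote" } }
            },
            // Strongly consistent read: reflects every write acknowledged before this call.
            ConsistentRead = true
        };

        GetItemResponse response = client.GetItem(request);

        Console.WriteLine(response.Item.Count > 0
            ? "Item found - the PUT succeeded."
            : "Item missing even with a consistent read - the PUT did not go through.");
    }
}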

To prove that your bug is really happening, you can look at the wire logs of the DynamoDB client (I'm not sure how to enable this in C#; I'm a Java guy), find a request where you PUT data to DynamoDB, then try to read it minutes later and confirm that you can't. If you take the RequestId that AWS returns in the response to both of these requests (the PUT that wrote the data and the GET that tried to read it), you can give them to AWS support and have them look into it.
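In C# you don't necessarily need full wire logs to get hold of the RequestId: the response object returned by the AWS SDK for .NET exposes it directly. A rough sketch of capturing it around the PutItem call (the helper name is made up for illustration):

using System;
using Amazon.DynamoDBv2;
using Amazon.DynamoDBv2.Model;

static class PutItemLogging
{
    public static void PutWithRequestId(AmazonDynamoDBClient client, PutItemRequest request)
    {
        try
        {
            PutItemResponse response = client.PutItem(request);

            // Keep this ID; AWS support can trace the individual request with it.
            Console.WriteLine("PutItem HTTP {0}, RequestId {1}",
                response.HttpStatusCode,
                response.ResponseMetadata.RequestId);
        }
        catch (AmazonDynamoDBException ex)
        {
            // Failed requests carry a RequestId as well.
            Console.WriteLine("PutItem failed: {0} (RequestId {1})", ex.Message, ex.RequestId);
        }
    }
}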

However, my guess is that if you go to the trouble of getting this logging working and look into it, you will find a bug where you aren't actually storing the data successfully.