0 votes

I have an Azure Table that stores thousands of discount codes, partitioned by the first letter of the code, so there are roughly 30 partitions with about 1000 records each. In my application I enter a code and get the specific record from the table. I then update the discount code to say that it has been used. When load testing this application with 1000 concurrent users for 30 seconds, reading a code takes less than 1 second, but updating the record takes over 10 seconds. Is this typical behavior for table storage, or is there a way to speed it up?

// Update a discount code to mark it as used
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

string code = "A0099";

CloudStorageAccount storageAccount = CloudStorageAccount.Parse("constring...");
CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
CloudTable table = tableClient.GetTableReference("discounts");

// Partition key is the upper-cased first letter of the code
string partitionKey = code[0].ToString().ToUpper();

// Retrieve the entity for this code
TableOperation retrieveOperation = TableOperation.Retrieve<DiscountEntity>(partitionKey, code);
TableResult retrievedResult = table.Execute(retrieveOperation);

if (retrievedResult.Result != null) {
    DiscountEntity discount = (DiscountEntity)retrievedResult.Result;
    discount.Used = true;

    // Replace the entity with Used = true
    TableOperation updateOperation = TableOperation.Replace(discount);
    table.Execute(updateOperation);
}

2 Answers

0 votes

This is not the default behavior, but I've seen it before... First of all, check your VM size, because the bigger the VM size, the faster the I/O (there's an MS doc somewhere that says the larger VMs get "high" I/O performance, or something like that...), but 10 seconds is a lot even for an extra-small VM...

To speed things up, I would suggest that you:

  • Implement a cache! Instead of searching for one code at a time, fetch the whole "letter" (partition) of unused codes in a single query, cache them, and then look up the code to update in the cache (see the sketch after this list).
  • Don't update live; instead, update the cache and then use the async methods to save the changes back to table storage.
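
Something like this rough sketch is what I mean. It assumes your DiscountEntity class and "discounts" table from the question, a DiscountCache wrapper class I just made up for illustration, and a storage client library version that exposes ExecuteQuery/ExecuteAsync:

// Sketch only: DiscountCache is a made-up illustration class, DiscountEntity and the
// "discounts" table come from the question.
using System.Collections.Concurrent;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Table;

public class DiscountCache
{
    // One in-memory dictionary per partition ("letter"), keyed by code (RowKey).
    private readonly ConcurrentDictionary<string, ConcurrentDictionary<string, DiscountEntity>> cache =
        new ConcurrentDictionary<string, ConcurrentDictionary<string, DiscountEntity>>();
    private readonly CloudTable table;

    public DiscountCache(CloudTable table)
    {
        this.table = table;
    }

    // Pull every unused code in a partition with one query and keep it in memory.
    public void WarmPartition(string partitionKey)
    {
        string filter = TableQuery.CombineFilters(
            TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, partitionKey),
            TableOperators.And,
            TableQuery.GenerateFilterConditionForBool("Used", QueryComparisons.Equal, false));
        TableQuery<DiscountEntity> query = new TableQuery<DiscountEntity>().Where(filter);

        ConcurrentDictionary<string, DiscountEntity> partition = new ConcurrentDictionary<string, DiscountEntity>();
        foreach (DiscountEntity entity in table.ExecuteQuery(query))
        {
            partition[entity.RowKey] = entity;
        }
        cache[partitionKey] = partition;
    }

    // Mark the code as used in the cache, then persist asynchronously instead of blocking the caller.
    public Task RedeemAsync(string code)
    {
        string partitionKey = code[0].ToString().ToUpper();
        ConcurrentDictionary<string, DiscountEntity> partition;
        DiscountEntity discount;
        if (!cache.TryGetValue(partitionKey, out partition) || !partition.TryGetValue(code, out discount))
        {
            return Task.FromResult(0); // unknown code, or partition not cached yet
        }

        discount.Used = true;
        DiscountEntity removed;
        partition.TryRemove(code, out removed); // it is no longer "unused"
        return table.ExecuteAsync(TableOperation.Replace(discount));
    }
}

You would warm each partition once (or on a timer) and let RedeemAsync do the write-back in the background; just keep in mind the cache and the table can drift apart if an async save fails.
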
0 votes

One thing you can check is the E2E (end-to-end) time for a specific request vs. how much time the server spent processing that request. That would allow you to see whether the bottleneck is the client/network or the server.
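
For example, here is a rough sketch of capturing the client-side E2E time with the storage client library's OperationContext. It reuses the table and updateOperation variables from the question and assumes a library version where OperationContext exposes StartTime/EndTime:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// Pass an OperationContext so the client records timing for this request.
// (table and updateOperation are the variables from the question's snippet.)
OperationContext context = new OperationContext();
TableResult result = table.Execute(updateOperation, null /* TableRequestOptions */, context);

// End-to-end time as seen by the client, including network transfer and any retries.
TimeSpan clientE2E = context.EndTime - context.StartTime;
Console.WriteLine("Client E2E: {0} ms, HTTP status: {1}",
    clientE2E.TotalMilliseconds, context.LastResult.HttpStatusCode);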

For more information on enabling Windows Azure Storage Analytics (specifically logging), please refer to the How to Monitor a Storage Account and Storage Analytics articles.
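
As a rough sketch, logging can also be enabled programmatically with the storage client library (this assumes the Microsoft.WindowsAzure.Storage client and reuses the connection string placeholder from the question; the management portal works just as well):

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;
using Microsoft.WindowsAzure.Storage.Table;

CloudStorageAccount account = CloudStorageAccount.Parse("constring...");
CloudTableClient client = account.CreateCloudTableClient();

// Turn on analytics logging for the Table service and keep the logs for 7 days.
ServiceProperties properties = client.GetServiceProperties();
properties.Logging.LoggingOperations = LoggingOperations.All;
properties.Logging.RetentionDays = 7;
properties.Logging.Version = "1.0";
client.SetServiceProperties(properties);

The resulting log entries land in the $logs blob container and include both the end-to-end latency and the server latency for every request, which is exactly the comparison described above.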