
I get an odd error while batching data to Azure Table Storage.

I have an array with over 350,000 strings, and I save each string in its own row. It works fine until roughly the first 50,000 records, then Azure Table Storage starts throwing an exception with "invalid inputtype" and status code 400.

When I batch, I batch 10 items at a time, with a simple retry policy.

// Retry up to 4 times, waiting 30 seconds between attempts
_TableContext.RetryPolicy = RetryPolicies.Retry(4, new TimeSpan(0, 0, 30));
// Send all pending inserts as a single batch request
_TableContext.SaveChanges(System.Data.Services.Client.SaveChangesOptions.Batch);

No async, no parallelism. It works fine in the dev environment.

Grrr...

Fiddler is your friend. There's nowhere near enough information here for anyone to guess at the answer, but capturing the HTTP request and response for the call that failed would almost certainly give us enough to go on. — user94559

2 Answers


There is a hard limit in Azure Table Storage of 1 MB per entity (row), and a limit of 64 KB (kilobytes) per string property.
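To illustrate (this is not from the original post), a quick pre-flight check against those limits might look like the sketch below. Table Storage stores string properties as UTF-16, so 64 KB corresponds to roughly 32,000 characters; the function and constant names are my own:

```python
# Hypothetical pre-flight check for Azure Table Storage string limits.
# Strings are stored as UTF-16, so 64 KB allows roughly 32,000 characters.

MAX_STRING_BYTES = 64 * 1024     # 64 KB per string property
MAX_ENTITY_BYTES = 1024 * 1024   # 1 MB per entity (all properties combined)

def string_property_size(value: str) -> int:
    """Size of a string property in bytes when encoded as UTF-16."""
    return len(value.encode("utf-16-le"))

def fits_in_string_property(value: str) -> bool:
    return string_property_size(value) <= MAX_STRING_BYTES

print(fits_in_string_property("hello"))        # short string -> True
print(fits_in_string_property("x" * 40_000))   # 80,000 bytes as UTF-16 -> False
```

Running a check like this over the 350,000 strings before inserting would reveal whether any of them blow the per-property limit around the point where the 400s start.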

Also, if you are storing the strings as partition keys or row keys, note that some characters are not allowed (among them '/', '\', '#', '?', and control characters).
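A sketch of a key sanitizer (again illustrative Python, not the poster's code) that strips the characters the linked documentation disallows:

```python
# Hypothetical key sanitizer: PartitionKey/RowKey values may not contain
# '/', '\', '#', '?', or control characters (U+0000-U+001F, U+007F-U+009F).

DISALLOWED = set('/\\#?')

def sanitize_key(key: str, replacement: str = "_") -> str:
    """Replace every character that is illegal in a partition/row key."""
    out = []
    for ch in key:
        code = ord(ch)
        if ch in DISALLOWED or code <= 0x1F or 0x7F <= code <= 0x9F:
            out.append(replacement)
        else:
            out.append(ch)
    return "".join(out)

print(sanitize_key("logs/2024#01?a"))  # -> "logs_2024_01_a"
```

Note that replacing characters can make two previously distinct keys collide, so in practice you may prefer encoding (e.g. percent-encoding) over plain substitution.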

Source: http://msdn.microsoft.com/en-us/library/dd179338.aspx


The error was my own mistake. I had tried to save batches containing entities with the same partition key and row key. When I changed that, it worked perfectly.
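For anyone hitting the same thing: an entity group transaction requires every entity in the batch to share a single partition key and have a distinct row key, and a batch is limited to 100 entities and 4 MB total. A sketch of grouping items so each batch satisfies those rules (illustrative Python, with made-up names, not the actual code from the question):

```python
# Illustrative sketch: split (partition_key, row_key) pairs into batches
# valid as Azure entity group transactions: one partition key per batch,
# no duplicate row keys within a batch, at most `size` entities per batch.

from collections import defaultdict

def make_batches(entities, size=10):
    """entities: iterable of (partition_key, row_key) tuples."""
    by_partition = defaultdict(list)
    for pk, rk in entities:
        by_partition[pk].append(rk)

    batches = []
    for pk, row_keys in by_partition.items():
        seen, current = set(), []
        for rk in row_keys:
            if rk in seen:
                continue  # a duplicate row key would 400 the whole batch
            seen.add(rk)
            current.append((pk, rk))
            if len(current) == size:
                batches.append(current)
                current = []
        if current:
            batches.append(current)
    return batches

items = [("p1", "r1"), ("p1", "r1"), ("p1", "r2"), ("p2", "r1")]
print(make_batches(items, size=10))  # duplicate ("p1", "r1") is dropped
```

Whether to silently drop duplicates (as here) or raise an error depends on the application; raising is usually safer, since a duplicate key often signals a bug upstream, as it did in this case.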

Azure FTW! :)