0 votes

I have some employee segmentation tasks that result in a large number of records (about 2000) that need to be inserted into SQL Azure. The records themselves are very small, about 4 integers each. An Azure worker role performs the segmentation task and inserts the resultant rows into a SQL Azure table. There might be multiple such tasks (each with about 1000 - 2000 rows) in the queue, so each of these inserts needs to be performed fairly quickly.

Timing tests from a local machine to SQL Azure took significant time (approximately 2 minutes for 1000 inserts). This is likely due to network latency; I am assuming the inserts from the worker role should be much faster.

However, since Entity Framework does not batch inserts well, we are considering SqlBulkCopy instead. Would using SqlBulkCopy result in the queries being throttled if the batch size is, say, 1000? Is there a recommended approach?
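For reference, the per-row pattern we are timing looks roughly like this (the table and column names below are illustrative placeholders, not our actual schema):

    using System.Data.SqlClient;

    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        foreach (var r in results) // ~1000 - 2000 segmentation rows
        {
            using (var cmd = new SqlCommand(
                "INSERT INTO dbo.SegmentationResults (EmployeeId, SegmentId, TaskId, Score) " +
                "VALUES (@e, @s, @t, @sc)", connection))
            {
                cmd.Parameters.AddWithValue("@e", r.EmployeeId);
                cmd.Parameters.AddWithValue("@s", r.SegmentId);
                cmd.Parameters.AddWithValue("@t", r.TaskId);
                cmd.Parameters.AddWithValue("@sc", r.Score);
                cmd.ExecuteNonQuery(); // one network round trip per row
            }
        }
    }

One round trip per row is why latency dominates the timing here.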

Some of these may not apply to SQL Azure specifically, but they may be useful reading nonetheless: dba.stackexchange.com/questions/16763/… stackoverflow.com/questions/8635818/… – Aaron Bertrand

1 Answer

2 votes

The Bulk Copy API should serve your purposes perfectly and result in dramatic performance improvements.

I have tested inserting 10 million records with a batch size of 2000 into an Azure database: no throttling occurred, with performance of roughly 10 seconds per batch when running from my local machine.
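A minimal sketch of what that looks like (the table and column names are placeholders for your schema; tune BatchSize as needed):

    using System.Data;
    using System.Data.SqlClient;

    // Stage the rows in a DataTable whose columns match the destination table.
    var table = new DataTable();
    table.Columns.Add("EmployeeId", typeof(int));
    table.Columns.Add("SegmentId", typeof(int));
    table.Columns.Add("TaskId", typeof(int));
    table.Columns.Add("Score", typeof(int));

    foreach (var r in results) // your segmentation output
        table.Rows.Add(r.EmployeeId, r.SegmentId, r.TaskId, r.Score);

    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "dbo.SegmentationResults";
            bulkCopy.BatchSize = 1000;     // rows sent per round trip
            bulkCopy.BulkCopyTimeout = 60; // seconds
            bulkCopy.WriteToServer(table);
        }
    }

The key difference from per-row inserts is that the whole batch goes to the server in one streamed operation instead of one round trip per row.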