6
votes

We are trying to move some data from one of our blob storage accounts and we are getting throttled.

Initially, we were getting 9 Gbps, but soon after we were throttled down to 1.1 Gbps.

We also started receiving errors saying that Azure forcibly closed the connection and we were getting network timeouts.

Has anyone experienced this or have any knowledge around increasing limits?


1 Answer

5
votes

According to the official document Azure subscription and service limits, quotas, and constraints (Storage limits section), there are some limits relevant to your scenario that cannot be worked around, as listed below.

  1. Maximum request rate per storage account: 20,000 requests per second
  2. Max egress:
    • for general-purpose v2 and Blob storage accounts (all regions): 50 Gbps
    • for general-purpose v1 storage accounts (US regions): 20 Gbps if RA-GRS/GRS is enabled, 30 Gbps for LRS/ZRS
    • for general-purpose v1 storage accounts (non-US regions): 10 Gbps if RA-GRS/GRS is enabled, 15 Gbps for LRS/ZRS
  3. Target throughput for single blob: Up to 60 MiB per second, or up to 500 requests per second

If you want to move the data programmatically and download it to a local environment, then besides your own network bandwidth and stability, you have to keep the number of concurrent requests per blob under 500 and the total request rate against the account under 20,000 per second. So controlling concurrency is the key point here.
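
Below is a minimal sketch of bounding that concurrency with the Python azure-storage-blob v12 SDK. The environment variable, container name, and the two worker counts are placeholders/assumptions, so tune them so your overall request rate stays under the documented targets:

```python
# Minimal sketch: bound concurrency when downloading blobs so the per-blob
# and per-account request-rate targets are not exceeded.
# AZURE_STORAGE_CONNECTION_STRING and "my-container" are placeholders.
import os
from concurrent.futures import ThreadPoolExecutor

from azure.storage.blob import ContainerClient

CONN_STR = os.environ["AZURE_STORAGE_CONNECTION_STRING"]  # assumed env var
CONTAINER = "my-container"                                # placeholder name

# Cap how many blobs are downloaded in parallel so the total request rate
# across the account stays well below 20,000 requests per second.
MAX_PARALLEL_BLOBS = 16
# Cap the parallel range (chunk) requests per blob so a single blob stays
# below its 500 requests-per-second target.
MAX_CONCURRENCY_PER_BLOB = 4

container = ContainerClient.from_connection_string(CONN_STR, CONTAINER)

def download_one(blob_name: str) -> None:
    """Download a single blob into the current working directory."""
    blob = container.get_blob_client(blob_name)
    with open(blob_name.replace("/", "_"), "wb") as fh:
        downloader = blob.download_blob(max_concurrency=MAX_CONCURRENCY_PER_BLOB)
        downloader.readinto(fh)

with ThreadPoolExecutor(max_workers=MAX_PARALLEL_BLOBS) as pool:
    futures = [pool.submit(download_one, b.name) for b in container.list_blobs()]
    for f in futures:
        f.result()  # re-raise any download error
```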

If you are just moving data within Azure, or don't need to do it programmatically, the best way is to use the official data transfer tools: AzCopy (for Windows or Linux) or Azure Data Factory. Then you won't need to worry about these limits yourself; just wait for the transfer to finish.
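
For example, here is a sketch of driving an AzCopy v10 server-side, container-to-container copy from Python. The account names, container name, and SAS token placeholders are assumptions; substitute your own values and make sure the azcopy binary is on your PATH:

```python
# Sketch: run a recursive server-side copy between two containers via AzCopy.
import subprocess

SOURCE = "https://srcaccount.blob.core.windows.net/my-container?<source-SAS>"
DEST = "https://dstaccount.blob.core.windows.net/my-container?<dest-SAS>"

subprocess.run(
    ["azcopy", "copy", SOURCE, DEST, "--recursive"],
    check=True,  # raise if azcopy exits with a non-zero status
)
```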

If you have any concerns, please feel free to let me know.