2
votes

I use the Google Translate API from C# via the "Google.Apis.Translate.v2" NuGet package (version 1.9.2.410) with the paid service.

The code looks something like this:

var GoogleService = new Google.Apis.Translate.v2.TranslateService(
    new BaseClientService.Initializer
    {
        ApiKey = Context.ConfigData.GoogleApiKey,
        ApplicationName = "Translator"
    });
...

var rqr = GoogleService.Translations.List(item, "de");
rqr.Source = "cs";

var result = await rqr.ExecuteAsync();

This code throws an exception:

User Rate Limit Exceeded [403] Errors [ Message[User Rate Limit Exceeded] Location[ - ] Reason[userRateLimitExceeded] Domain[usageLimits] ]

This never happened before. My limits are:

Total quota: 50,000,000 characters/day
Remaining: 49,344,849 characters/day (98.69% of total)
Per-user limit: 100 requests/second/user

The number of requests is certainly well under 100 per second. What's wrong?


1 Answer

2
votes

There is an undocumented quota for the Translate API. This quota limits the number of characters per 100 seconds per user to 10,000 (i.e. 10,000 chars/100 seconds/user).

This means that, even if you split large texts into separate requests, you won't be able to exceed 10,000 characters within any 100-second interval.

Brief examples:

  • If you use up the 10k characters within the first 5 seconds, you will need to wait 95 seconds before translating more.
  • If you hit this quota after 50 seconds, you will need to wait another 50.
  • If you hit it at second 99, you will only need to wait 1 second to continue working.
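One way to stay under this limit proactively is to track how many characters you have sent in the current window and pause before sending a request that would push you over. This is only a sketch: the 10,000-character budget and 100-second window come from the undocumented quota described above, and since the server's intervals are continuous rather than starting when you begin sending, a locally tracked window is an approximation that should still be combined with error handling.

```
// Sketch: client-side pacing to stay under ~10,000 chars per 100 seconds.
// The budget and window length are assumptions based on the quota above.
using System;
using System.Threading.Tasks;

class CharacterThrottle
{
    const int CharsPerWindow = 10_000;
    static readonly TimeSpan Window = TimeSpan.FromSeconds(100);

    int usedChars = 0;
    DateTime windowStart = DateTime.UtcNow;

    public async Task WaitForBudgetAsync(int chars)
    {
        if (DateTime.UtcNow - windowStart >= Window)
        {
            // A new 100-second interval has begun; reset the local budget.
            usedChars = 0;
            windowStart = DateTime.UtcNow;
        }

        if (usedChars + chars > CharsPerWindow)
        {
            // Wait out the remainder of the current interval before sending.
            var remaining = Window - (DateTime.UtcNow - windowStart);
            if (remaining > TimeSpan.Zero)
                await Task.Delay(remaining);
            usedChars = 0;
            windowStart = DateTime.UtcNow;
        }

        usedChars += chars;
    }
}
```

Call `await throttle.WaitForBudgetAsync(item.Length)` before each `ExecuteAsync()` call.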

What I would recommend is to always catch exceptions and retry a number of times with exponential backoff. The idea is that if the server is temporarily rejecting your requests because you hit the 100-second interval quota, it is not overwhelmed by requests arriving all at once until the quota resets (and therefore returning 403 errors continuously). You can see a brief explanation of this practice here (the sample is focused on the Drive API, but the same concepts apply to every cloud-based service).
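A minimal backoff wrapper for the question's request could look like the following. The helper name and retry count are my own choices; `GoogleApiException` and its `HttpStatusCode` property come from the Google.Apis client library that the question already uses.

```
// Sketch: retry a Google API call with exponential backoff on 403 errors.
using System;
using System.Net;
using System.Threading.Tasks;
using Google;

static async Task<T> ExecuteWithBackoffAsync<T>(Func<Task<T>> call, int maxRetries = 5)
{
    var rng = new Random();
    for (int attempt = 0; ; attempt++)
    {
        try
        {
            return await call();
        }
        catch (GoogleApiException e) when (
            e.HttpStatusCode == HttpStatusCode.Forbidden && attempt < maxRetries)
        {
            // Wait 2^attempt seconds plus up to 1 second of random jitter,
            // so concurrent clients don't all retry at the same instant.
            var delay = TimeSpan.FromSeconds(Math.Pow(2, attempt))
                      + TimeSpan.FromMilliseconds(rng.Next(1000));
            await Task.Delay(delay);
        }
    }
}

// Usage with the request object from the question:
// var result = await ExecuteWithBackoffAsync(() => rqr.ExecuteAsync());
```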

Alternatively, you could catch exceptions and, whenever you get a 403 error, apply a delay of 100 seconds and retry. This won't be the most time-efficient solution, as the 100-second intervals are continuous (they don't start when the quota is reached), but it will ensure that you don't hit the limit twice with the same request.
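Sketched against the question's own code, this simpler approach might look like this (again assuming the library's `GoogleApiException`; the single-retry structure is my own simplification):

```
// Sketch: on a 403, wait a full 100-second interval and retry once.
using System;
using System.Net;
using System.Threading.Tasks;
using Google;

Google.Apis.Translate.v2.Data.TranslationsListResponse result;
try
{
    result = await rqr.ExecuteAsync();
}
catch (GoogleApiException e) when (e.HttpStatusCode == HttpStatusCode.Forbidden)
{
    // The 100-second intervals are continuous, so waiting a full interval
    // guarantees the next attempt lands in a fresh quota window.
    await Task.Delay(TimeSpan.FromSeconds(100));
    result = await rqr.ExecuteAsync();
}
```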