7
votes

I'm building a web application on top of the Google Drive API. The web application displays photos and videos that are stored in a Google Drive folder: once authenticated, the application makes requests to the Google Drive API to get a URL for each media file and then displays it. For the moment, I have only 16 images to display, and these are hard-coded in the application (for the demo).

I have encountered an issue with my application accessing the Google Drive API. After multiple tries, I get this error for seemingly random requests:

User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=XXXXXXX

So I looked at the API Console and saw nothing special; as far as I can tell, I don't exceed the rate limit. Maybe I'm using the Google API the wrong way, I honestly don't know.

(API Console quota graph: the bar on the right is my last try, 32 queries)

I followed the Google Drive API documentation to check whether I did something wrong. Each API request contains the access token, so it should work correctly!
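For reference, each request looks roughly like this (fileId and accessToken are placeholders here, not the exact code from App.js):

const url = `https://www.googleapis.com/drive/v3/files/${fileId}?fields=webContentLink,thumbnailLink`;

fetch(url, {
    headers: {
        // OAuth 2.0 access token obtained after authentication
        Authorization: `Bearer ${accessToken}`,
    },
})
    .then(response => response.json())
    .then(file => {
        // file.webContentLink / file.thumbnailLink is then used as the <img> source
    });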

A demonstration of the app is available: https://poc-drive-api.firebaseapp.com

The source code is also available: https://github.com/Mcdostone/poc-google-drive-api (file App.js)

3
See stackoverflow.com/questions/18578768/… and stackoverflow.com/questions/18529524/…. You need to implement throttling. NB: NOT exponential backoff. – pinoyyid
In my case I was uploading 50 to 200 files at the same time using curl multi. I bought 10 VPSes just to proxy the requests and avoid the rate limit, but it never went away. Probably using multiple accounts could solve the problem. – John Doe

3 Answers

6
votes

403: User Rate Limit Exceeded is flood protection: a user can only make so many requests at a time. Unfortunately, the per-user rate limit is not shown in the graph you are looking at. That graph is actually really bad at showing what is truly happening: Google checks in the background and kicks out the error if you are exceeding your limit, and they are not required to show that in the graph.

403: User Rate Limit Exceeded

The per-user limit has been reached. This may be the limit from the Developer Console or a limit from the Drive backend.

{ "error": { "errors": [ { "domain": "usageLimits", "reason": "userRateLimitExceeded", "message": "User Rate Limit Exceeded" } ], "code": 403, "message": "User Rate Limit Exceeded" } }

Suggested actions:

  • Raise the per-user quota in the Developer Console project.
  • If one user is making a lot of requests on behalf of many users of a G Suite domain, consider a Service Account with authority delegation (setting the quotaUser parameter).
  • Use exponential backoff.

IMO the main thing to do when you begin to encounter this error message is to implement exponential backoff; that way your application will be able to slow down and retry the request.
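Something along these lines (a rough sketch, not the client library's built-in retry; the check on err.code assumes the error exposes the HTTP status that way, as the Node googleapis client does, so adjust it for your own client):

// Retry a promise-returning request with exponential backoff plus jitter
async function requestWithBackoff(makeRequest, maxRetries = 5) {
    for (let attempt = 0; ; attempt++) {
        try {
            return await makeRequest();
        } catch (err) {
            const rateLimited = err.code === 403 || err.code === 429;
            if (!rateLimited || attempt >= maxRetries) throw err;
            // Wait 2^attempt seconds plus up to 1 second of random jitter
            const delayMs = Math.pow(2, attempt) * 1000 + Math.random() * 1000;
            await new Promise(resolve => setTimeout(resolve, delayMs));
        }
    }
}

// Usage (illustrative): const res = await requestWithBackoff(() => drive.files.list({ pageSize: 100 }));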

3
votes

In my case, I was recursing through Google Drive folders in parallel and getting this error. I solved the problem by implementing client-side rate limiting using the Bottleneck library with a 110ms delay between requests:

const Bottleneck = require('bottleneck');

const limiter = new Bottleneck({
    // Google allows 1000 requests per 100 seconds per user,
    // which is 100ms per request on average. Adding a delay
    // of 100ms still triggers "rate limit exceeded" errors,
    // so going with 110ms.
    minTime: 110,
});

// Wrap every API request with the rate limiter
await limiter.schedule(() => drive.files.list({
    // Params...
}));

3
votes

I was using the limiter library to enforce the "1000 queries per 100 seconds" limit, but I was still getting 403 errors. I finally stumbled upon this page where it mentions that:

In the API Console, there is a similar quota referred to as Requests per 100 seconds per user. By default, it is set to 100 requests per 100 seconds per user and can be adjusted to a maximum value of 1,000. But the number of requests to the API is restricted to a maximum of 10 requests per second per user.

So I updated the limiter library to only allow 10 requests every second instead of 1,000 every 100 seconds and it worked like a charm.

const RateLimiter = require('limiter').RateLimiter;

// Allow at most 10 requests per 1000 ms
const limiter = new RateLimiter(10, 1000);
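Each Drive call then waits for a token before firing. A rough sketch of the wiring (this uses the callback-style removeTokens from the 1.x versions of limiter, and drive.files.list stands in for whichever call you are throttling):

function throttledList(params, callback) {
    // Block until a token is available, then issue the actual request
    limiter.removeTokens(1, err => {
        if (err) return callback(err);
        drive.files.list(params, callback);
    });
}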