I use the Google API PHP Client (http://code.google.com/p/google-api-php-client/) to make OAuth requests, specifically to obtain a new Access Token.
I have cached the Refresh Token and use it to generate a new Access Token, roughly as in the sketch below. I have gone over the documentation (https://developers.google.com/accounts/docs/OAuth2, https://developers.google.com/storage/docs/developer-guide), and it only talks about limits on Refresh Tokens (one limit per client/user combination, and another per user across all clients); it says nothing about limits on Access Tokens, except that an Access Token is only valid for an hour.
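The refresh step each process performs looks roughly like this (simplified; the client ID/secret values are placeholders, and the raw cURL call is just meant to show the token request the library makes on my behalf):

    // Placeholders for my real credentials and cached Refresh Token.
    $clientId           = 'MY_CLIENT_ID';
    $clientSecret       = 'MY_CLIENT_SECRET';
    $cachedRefreshToken = 'MY_REFRESH_TOKEN';   // loaded from my cache

    // Exchange the cached Refresh Token for a new Access Token.
    // This is essentially the request the library issues when refreshing.
    $ch = curl_init('https://accounts.google.com/o/oauth2/token');
    curl_setopt_array($ch, array(
        CURLOPT_POST           => true,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POSTFIELDS     => http_build_query(array(
            'grant_type'    => 'refresh_token',
            'refresh_token' => $cachedRefreshToken,
            'client_id'     => $clientId,
            'client_secret' => $clientSecret,
        )),
    ));
    $response = json_decode(curl_exec($ch), true);
    curl_close($ch);
    $accessToken = $response['access_token'];   // valid for about an hour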
I'm trying to calculate size usage across thousands of buckets. To cut down on time, I parallelize the task: I spawn a new process for each bucket, and each process requests its own new Access Token (a sketch follows below). I do this because I assume there is no limit on the number of Access Tokens issued, and because, for a bucket with a very large number of objects, the calculation time plus potential exponential-backoff time could theoretically exceed the lifetime of a single Access Token.
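Schematically, the spawning looks like this (simplified, error handling omitted; refreshAccessToken() and calculateBucketSize() are stand-ins for the refresh call above and for my actual object-listing/size-summing code):

    // Simplified sketch of my fork-per-bucket loop (pcntl extension).
    $children = array();
    foreach ($buckets as $bucket) {              // thousands of bucket names
        $pid = pcntl_fork();
        if ($pid === 0) {
            // Child: gets its own fresh Access Token, then sums object sizes.
            $accessToken = refreshAccessToken($cachedRefreshToken);
            calculateBucketSize($bucket, $accessToken);
            exit(0);
        }
        $children[] = $pid;
        if (count($children) >= 16) {            // keep about 16 workers running
            $done = pcntl_wait($status);         // wait for any child to exit
            $children = array_diff($children, array($done));
        }
    }
    foreach ($children as $pid) {
        pcntl_waitpid($pid, $status);
    }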
But when I try to do this, I see this error:
Error No: 1
Error on Line: 242
Error Message: Uncaught exception 'apiAuthException' with message 'Error refreshing the OAuth2 token, message:
<HTML>
<HEAD>
<TITLE>User Rate Limit Exceeded</TITLE>
</HEAD>
<BODY BGCOLOR="#FFFFFF" TEXT="#000000">
<H1>User Rate Limit Exceeded</H1>
<H2>Error 403</H2>
</BODY>
</HTML>
Is this because I'm requesting a lot of Access Tokens (16 at the moment, one per process)?
If not, what is causing this error, and what's the best way to get around it?
Is there a Google Documentation page that documents the User Rate Limits?