I'm getting `403 rateLimitExceeded` errors while doing streaming inserts into BigQuery. I'm running many streaming inserts in parallel, so I understand that this might cause some rate limiting, but I'm not sure which rate limit in particular is the issue.
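For context, each parallel worker does something like this (a simplified sketch assuming the Python google-cloud-bigquery client; the table name and row payloads are placeholders):

```python
from google.cloud import bigquery

client = bigquery.Client()

def stream_batch(table_id: str, rows: list[dict]) -> None:
    """Stream one batch of up to 500 rows into a single table."""
    # insert_rows_json issues a streaming insert (tabledata.insertAll)
    # and returns a list of per-row errors (empty on success).
    errors = client.insert_rows_json(table_id, rows)
    if errors:
        raise RuntimeError(f"Streaming insert failed: {errors}")

# Many of these calls run concurrently, spread across ~2,000 tables, e.g.:
# stream_batch("my-project.my_dataset.events_0042", batch_of_500_rows)
```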
Here's what I get:
```json
{
  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "Exceeded rate limits: Your table exceeded quota for rows. For more information, see https://cloud.google.com/bigquery/troubleshooting-errors",
    "reason" : "rateLimitExceeded"
  } ],
  "message" : "Exceeded rate limits: Your table exceeded quota for rows. For more information, see https://cloud.google.com/bigquery/troubleshooting-errors"
}
```
Based on BigQuery's troubleshooting docs, `403 rateLimitExceeded` is caused by either concurrent rate limiting or API request limits, but the docs make it sound like neither of those applies to streaming operations.

However, the message in the error mentions `table exceeded quota for rows`, which sounds more like the `403 quotaExceeded` error. The streaming quotas are:
- Maximum row size: 1 MB - I'm under this - my average row size is a few KB, and I explicitly cap row sizes to ensure they never reach 1 MB
- HTTP request size limit: 10 MB - I'm under this - my average batch size is < 400 KB and my max is < 1 MB
- Maximum rows per second: 100,000 rows per second, per table. Exceeding this amount will cause `quota_exceeded` errors. - I can't imagine I'm over this: each batch is about 500 rows and takes about 500 ms, so a single table sees at most ~1,000 rows/sec. I'm running in parallel, but the inserts are spread across about 2,000 tables, so even if my aggregate rate somehow reached 100k rows/sec (unlikely), there's no way any one table is near the limit (see the back-of-envelope math after this list).
- Maximum rows per request: 500 - I'm right at 500
- Maximum bytes per second: 100 MB per second, per table. Exceeding this amount will cause `quota_exceeded` errors. - Again, my per-table insert rates are nowhere near this volume.
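To sanity-check my rates against the per-table limits, here's the back-of-envelope math (a rough sketch using the approximate figures above, and assuming batches to any given table run serially, which matches my setup):

```python
# Rough per-table rate check using approximate figures from above.
rows_per_batch = 500          # I batch at the 500-rows-per-request limit
batch_seconds = 0.5           # each batch takes ~500 ms end to end
bytes_per_batch = 400 * 1024  # average batch size is < 400 KB

rows_per_sec = rows_per_batch / batch_seconds        # ~1,000 rows/s per table
mb_per_sec = bytes_per_batch / batch_seconds / 1e6   # ~0.8 MB/s per table

print(rows_per_sec)  # 1000.0 -> far under the 100,000 rows/s per-table quota
print(mb_per_sec)    # ~0.82  -> far under the 100 MB/s per-table quota
```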
Any thoughts or suggestions as to which rate limit I'm actually hitting would be appreciated!