2
votes

When trying to insert data into Google BigQuery, we are getting the following error:

table.write: Exceeded rate limits: too many table update operations for this table. For more information, see https://cloud.google.com/bigquery/troubleshooting-errors (error code: rateLimitExceeded)

As per the documentation, I may be exceeding one of its table update limits, but the error doesn't say which one.

How do I know which limit my application is exceeding?

I have already explored other solutions on the web, and none of them worked.

4
Isn't the thousand updates per table per day obvious from "too many table update operations"? – U. Windl

4 Answers

3
votes

One thing you could check is your Quotas page (Navigation Menu -> IAM & Admin -> Quotas). Under Service, select only BigQuery API to see whether you're hitting any BigQuery API quota. If you aren't, you're most likely hitting the "Daily destination table update limit — 1,000 updates per table per day".
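If the Quotas page doesn't make it obvious, you can also count the update operations yourself from the INFORMATION_SCHEMA.JOBS_BY_PROJECT view. Below is a hedged sketch using the @google-cloud/bigquery Node.js client; the region-us qualifier, the 24-hour window, and the job-type filter are assumptions you may need to adjust for your project:

import { BigQuery } from '@google-cloud/bigquery';

// Count storage-modifying jobs (loads, copies, and queries with a real
// destination table) per table over the last 24 hours, to spot tables
// approaching the 1,000-updates-per-day limit.
const sql = `
  SELECT
    destination_table.dataset_id AS dataset_id,
    destination_table.table_id AS table_id,
    COUNT(*) AS updates_last_24h
  FROM \`region-us\`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
  WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 24 HOUR)
    AND job_type IN ('QUERY', 'LOAD', 'COPY')
    AND destination_table.table_id IS NOT NULL
    AND NOT STARTS_WITH(destination_table.dataset_id, '_')  -- skip anonymous result tables
  GROUP BY dataset_id, table_id
  ORDER BY updates_last_24h DESC`;

async function main(): Promise<void> {
  const bigquery = new BigQuery();
  const [rows] = await bigquery.query({ query: sql });
  for (const row of rows) {
    console.log(`${row.dataset_id}.${row.table_id}: ${row.updates_last_24h} updates`);
  }
}

main().catch(console.error);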

1
votes

You're reaching the table update limit. This means you're submitting lots of operations that modify the storage of your table (insert, update, or delete). Keep in mind that this also includes load jobs, DML statements, and queries with destination tables. Since the quota replenishes periodically, you can wait a couple of minutes and try again, but keep an eye on your table update quota so you don't hit this error again.
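If the failures are intermittent, a simple mitigation is to retry with exponential backoff. A minimal TypeScript sketch; the runUpdate callback and the string check for "rateLimitExceeded" are illustrative assumptions about how your update runs and how the error surfaces:

// Retry a table-updating call with exponential backoff when BigQuery
// reports rateLimitExceeded.
async function withBackoff<T>(runUpdate: () => Promise<T>, maxRetries = 5): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await runUpdate();
    } catch (err) {
      const message = err instanceof Error ? err.message : String(err);
      if (!message.includes('rateLimitExceeded') || attempt >= maxRetries) throw err;
      // Wait 1s, 2s, 4s, ... plus jitter before the next attempt.
      const delayMs = 1000 * 2 ** attempt + Math.random() * 250;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}

Note that backing off only helps with the rolling rate limit; if you've exhausted the daily per-table update quota, you have to wait for it to reset.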

If you're inserting rows through many small operations rather than a few larger ones, consider using Streaming Inserts instead.
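For example, with the Node.js client a streaming insert looks roughly like this (the dataset and table names are placeholders); rows arriving through tabledata.insertAll don't consume the per-table update quota that batch modifications do:

import { BigQuery } from '@google-cloud/bigquery';

// A minimal streaming-insert sketch; 'temp' and 'bucket_locations' are
// placeholder dataset/table names.
async function streamRows(): Promise<void> {
  const bigquery = new BigQuery();
  const rows = [
    { bucket_location: 'ASIA-EAST1' },
    { bucket_location: 'ASIA-NORTHEAST2' },
  ];
  await bigquery.dataset('temp').table('bucket_locations').insert(rows);
  console.log(`Streamed ${rows.length} rows`);
}

streamRows().catch(console.error);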

1
votes

Let me reproduce the error with a real case I got from a teammate:

# create the table
CREATE TABLE temp.bucket_locations
AS 
SELECT 'ASIA-EAST1' bucket_location
UNION ALL SELECT 'ASIA-NORTHEAST2' bucket_location;

# update several times
UPDATE temp.bucket_locations
 SET bucket_location = "US"
 WHERE UPPER(bucket_location) LIKE "US%";
UPDATE temp.bucket_locations
 SET bucket_location = "TW"
 WHERE UPPER(bucket_location) LIKE "ASIA-EAST1%";
UPDATE temp.bucket_locations
 SET bucket_location = "JP"
 WHERE UPPER(bucket_location) LIKE "ASIA-NORTHEAST1%";
UPDATE temp.bucket_locations
 SET bucket_location = "HK"
 WHERE UPPER(bucket_location) LIKE "ASIA-EAST2%";
UPDATE temp.bucket_locations
 SET bucket_location = "JP"
 WHERE UPPER(bucket_location) LIKE "ASIA-NORTHEAST2%";
UPDATE temp.bucket_locations
 SET bucket_location = "KR"
 WHERE UPPER(bucket_location) LIKE "ASIA-NORTHEAST3%";
UPDATE temp.bucket_locations
 SET bucket_location = "IN"
 WHERE UPPER(bucket_location) LIKE "ASIA-SOUTH1%";
UPDATE temp.bucket_locations
 SET bucket_location = "SG"
 WHERE UPPER(bucket_location) LIKE "ASIA-SOUTHEAST1%";
UPDATE temp.bucket_locations
 SET bucket_location = "AU"
 WHERE UPPER(bucket_location) LIKE "AUSTRALIA%";
UPDATE temp.bucket_locations
 SET bucket_location = "FI"
 WHERE UPPER(bucket_location) LIKE "EUROPE-NORTH1%";
UPDATE temp.bucket_locations
 SET bucket_location = "BE"
 WHERE UPPER(bucket_location) LIKE "EUROPE-WEST1%";
UPDATE temp.bucket_locations
 SET bucket_location = "GB"
 WHERE UPPER(bucket_location) LIKE "EUROPE-WEST2%";
UPDATE temp.bucket_locations
 SET bucket_location = "DE"
 WHERE UPPER(bucket_location) LIKE "EUROPE-WEST3%";
UPDATE temp.bucket_locations
 SET bucket_location = "NL"
 WHERE UPPER(bucket_location) LIKE "EUROPE-WEST4%";
UPDATE temp.bucket_locations
 SET bucket_location = "CH"
 WHERE UPPER(bucket_location) LIKE "EUROPE-WEST6%";
UPDATE temp.bucket_locations
 SET bucket_location = "CA"
 WHERE UPPER(bucket_location) LIKE "NORTHAMERICA%";
UPDATE temp.bucket_locations
 SET bucket_location = "BR"
 WHERE UPPER(bucket_location) LIKE "SOUTHAMERICA%";

Exceeded rate limits: too many table update operations for this table

The solution in this case is to avoid issuing so many updates. Instead, we can run a single statement that combines all the mappings:

CREATE TEMP TABLE `mappings`
AS 
SELECT *
FROM UNNEST([
  STRUCT('US' AS abbr, 'US%' AS long),
  ('TW', 'ASIA-EAST1%'),
  ('JP', 'ASIA-NORTHEAST2%')
  # add the remaining mappings
]);

UPDATE temp.bucket_locations
 SET bucket_location = abbr
 FROM mappings 
 WHERE UPPER(bucket_location) LIKE long;

0
votes

In terms of a solution, use await bigquery.createJob(jobConfig); instead of await bigquery.createQueryJob(jobConfig); the former lets you run the query as a batch job, while the latter runs it as an interactive query job.

Running the query as a batch means it won't count toward BigQuery's concurrent rate limits.
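A minimal sketch of such a jobConfig with the Node.js client; the SQL string and the surrounding function are placeholders, but priority: 'BATCH' is the setting that makes the job run as a batch:

import { BigQuery } from '@google-cloud/bigquery';

// Submit a query with batch priority instead of the default interactive.
async function runAsBatch(sql: string): Promise<void> {
  const bigquery = new BigQuery();
  const [job] = await bigquery.createJob({
    configuration: {
      query: {
        query: sql,
        useLegacySql: false,
        priority: 'BATCH', // queue as a batch job
      },
    },
  });
  // Wait for the job to finish and fetch its results.
  const [rows] = await job.getQueryResults();
  console.log(`Batch job ${job.id} finished with ${rows.length} rows`);
}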

From GCP documentation:

By default, BigQuery runs interactive query jobs, which means that the query is executed as soon as possible. Interactive queries count toward your concurrent rate limit and your daily limit.

Batch queries don't count towards your concurrent rate limit.

I was running a MERGE query to deduplicate, and switching to a batch job resolved the errors. I didn't see any discernible difference in processing time.