0 votes

I have an ASP.NET application in which I am trying to insert rows into Google BigQuery through streaming (tabledata.insertAll()). I am doing this with an HTTP POST request, and in the request body I am supplying data with the following structure:

{ "kind": "bigquery#tableDataInsertAllRequest", "rows": [ { "insertId": string, "json": { (key): (value) } } ] }

When I pass more than 100 rows (e.g., 101) in the request body, I get a 400 Bad Request error. But when I pass 100 rows or fewer, it works fine with no errors.

Is there a limit on the number of rows per request when using streaming?


2 Answers

2 votes

tabledata.insertAll() has the following limits:

  • Maximum row size: 100 KB
  • Maximum data size of all rows, per insert: 1 MB
  • Maximum rows per second: 100 rows per second, per table, with occasional bursts of up to 1,000 rows per second allowed. If you
    exceed 100 rows per second for an extended period of time, throttling might occur.

Visit https://developers.google.com/bigquery/streaming-data-into-bigquery for the streaming quota policy.
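If a single request is being rejected, one workaround is to split the rows into smaller batches and send each batch as a separate insertAll call, so every request stays comfortably within the per-request limits. A minimal sketch, assuming C# and a postBatchAsync delegate standing in for the HTTP call shown in the question (the batch size of 100 is just a conservative pick):

    // Sketch: send a large set of row payloads in smaller insertAll batches.
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Threading.Tasks;

    static class BatchingExample
    {
        public static async Task InsertInBatchesAsync(
            IReadOnlyList<Dictionary<string, object>> rows,
            Func<IEnumerable<Dictionary<string, object>>, Task> postBatchAsync,
            int batchSize = 100)
        {
            for (int i = 0; i < rows.Count; i += batchSize)
            {
                var batch = rows.Skip(i).Take(batchSize);
                await postBatchAsync(batch); // one insertAll request per batch
            }
        }
    }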

1 vote

Update from https://developers.google.com/bigquery/streaming-data-into-bigquery :

The following limits apply for streaming data into BigQuery.

  • Maximum row size: 20 KB
  • Maximum data size of all rows, per insert: 1 MB
  • Maximum rows per second: 10,000 rows per second, per table. Exceeding this amount will cause quota_exceeded errors. For additional support up to 100,000 rows per second, per table, please contact a sales representative.
  • Maximum bytes per second: 10 MB per second, per table. Exceeding this amount will cause quota_exceeded errors.

I think you will stop getting these kinds of errors.
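Since exceeding those per-second rates produces quota_exceeded errors, retrying with exponential backoff is a common mitigation. A minimal sketch, again assuming C# and a sendInsertAsync delegate that performs the actual insertAll call (the status-code check is a simplification):

    // Sketch: retry an insertAll call with exponential backoff on rate/quota errors.
    using System;
    using System.Net;
    using System.Net.Http;
    using System.Threading.Tasks;

    static class RetryExample
    {
        public static async Task<HttpResponseMessage> InsertWithBackoffAsync(
            Func<Task<HttpResponseMessage>> sendInsertAsync, int maxAttempts = 5)
        {
            for (int attempt = 0; ; attempt++)
            {
                var response = await sendInsertAsync();

                // quota_exceeded typically surfaces as 403; 429 covers generic rate limiting.
                bool rateLimited = response.StatusCode == (HttpStatusCode)429
                                   || response.StatusCode == HttpStatusCode.Forbidden;

                if (!rateLimited || attempt == maxAttempts - 1)
                    return response;

                // Exponential backoff: 1s, 2s, 4s, ...
                await Task.Delay(TimeSpan.FromSeconds(Math.Pow(2, attempt)));
            }
        }
    }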