I have an ASP.NET application in which I am trying to insert rows into Google BigQuery via streaming (tabledata.insertAll()). I am doing this with an HTTP POST request, and in the request body I am supplying data with the following structure:
{ "kind": "bigquery#tableDataInsertAllRequest", "rows": [ { "insertId": string, "json": { (key): (value) } } ] }
When I pass more than 100 rows (for example, 101) in the request body, I get a 400 Bad Request error. With 100 rows or fewer, the request succeeds with no errors.
Is there a limit on the number of rows per request when using streaming inserts?