
I am constantly uploading my customer log files (CSV format) into BigQuery, across 10 different projects and 10 different tables, from a C# app.

It is a catch-up loading process: I keep downloading the files from our server, turn them into BigQuery load job requests every 5 minutes, and send them to BigQuery.

I also send the requests concurrently (25 requests at a time; whenever any request finishes, I start a new task to handle the next one, for about 100 requests per 5 minutes). This works fine for the first hour (0.4344 requests/sec), but after 1-2 hours the loading speed drops dramatically (0.0256 requests/sec). Is there any particular reason why this happens?
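A simplified sketch of my request pattern (the names here are illustrative, not my exact code):

    using System.Collections.Generic;
    using System.Threading.Tasks;

    // Keep up to 25 load-job requests in flight; when any finishes, start the next.
    async Task RunCatchUpLoadsAsync(IEnumerable<string> filesToLoad)
    {
        var inFlight = new List<Task>();
        foreach (var file in filesToLoad)
        {
            if (inFlight.Count >= 25)
            {
                // Wait for whichever outstanding request finishes first...
                Task finished = await Task.WhenAny(inFlight);
                inFlight.Remove(finished);
            }
            // ...then start a new task to handle the next request.
            inFlight.Add(SubmitLoadJobAsync(file));
        }
        await Task.WhenAll(inFlight);
    }

    // Stub standing in for my wrapper around the BigQuery load-job insert call.
    Task SubmitLoadJobAsync(string file) => Task.Delay(100);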

How can I avoid the slow loading? Here are two example job IDs I can provide: job__ITLp_FzYf1gPvy9HhAwkQ42N3A and job_4OoXPA4mGRBADX5LaVOrReecOyc

It seems all the jobs keep getting delayed after that 2-hour mark because of the speed drop.

Please let me know if I can add any details to my question; I am pretty new to BigQuery and Google Cloud. Thanks a lot in advance.


1 Answer


Load jobs are intended for bulk data import and may have variable speed depending on overall system load.

If you want continuous low-latency ingestion of small amounts of data, consider using our streaming API instead:

https://cloud.google.com/bigquery/docs/reference/v2/tabledata/insertAll
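For example, from C# you could stream rows with the Google.Cloud.BigQuery.V2 client library, which calls tabledata.insertAll under the hood. This is a minimal sketch; the project, dataset, table, and column names are placeholders for your own:

    using Google.Cloud.BigQuery.V2;

    // Stream a small batch of rows instead of submitting a load job.
    // All identifiers below are placeholders.
    var client = BigQueryClient.Create("my-project-id");
    var rows = new[]
    {
        new BigQueryInsertRow { { "customer_id", "c-123" }, { "logged_at", "2015-06-01T12:00:00Z" } },
        new BigQueryInsertRow { { "customer_id", "c-456" }, { "logged_at", "2015-06-01T12:00:05Z" } },
    };
    client.InsertRows("my_dataset", "my_table", rows);

Streamed rows are typically available for query within a few seconds, so there is no need to batch files into 5-minute load jobs.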