We are using google-php-client-api to stream website page-view logs into a BigQuery table with 9 columns, all basic data types:
- cookieid (string)
- domain (string)
- site_category (string)
- site_subcategory (string)
- querystring (string)
- connectiontime (timestamp)
- flag (boolean)
- duration (integer)
- remoteip (string)
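For reference, the table definition above corresponds to a BigQuery JSON schema roughly like the following (the mode values are our assumption; we treat every field as NULLABLE):

```json
[
  {"name": "cookieid",         "type": "STRING",    "mode": "NULLABLE"},
  {"name": "domain",           "type": "STRING",    "mode": "NULLABLE"},
  {"name": "site_category",    "type": "STRING",    "mode": "NULLABLE"},
  {"name": "site_subcategory", "type": "STRING",    "mode": "NULLABLE"},
  {"name": "querystring",      "type": "STRING",    "mode": "NULLABLE"},
  {"name": "connectiontime",   "type": "TIMESTAMP", "mode": "NULLABLE"},
  {"name": "flag",             "type": "BOOLEAN",   "mode": "NULLABLE"},
  {"name": "duration",         "type": "INTEGER",   "mode": "NULLABLE"},
  {"name": "remoteip",         "type": "STRING",    "mode": "NULLABLE"}
]
```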
After 10 hours of running the scripts, we observed that our BigQuery API usage (insertAll calls) had reached about 300K, yet during that time only about 35K rows were actually written to the table...
Looking at the Google Cloud Console, roughly 299K of those 300K API calls returned success codes; in other words, the streaming appeared to be working fine.
What we don't understand is how, after 299K successful requests, only 35K rows ended up in the table.
Is this a problem caused by google-php-client-api, or has BigQuery simply not committed the streamed data to the table yet?
If it is the latter, how long should it take before all of the rows sent to BigQuery become visible?
Code used for streaming data:
$rows = array();
$data = json_decode($rawjson); // decode one raw JSON log record

// Each request carries a single row.
$row = new Google_Service_Bigquery_TableDataInsertAllRequestRows();
$row->setJson($data);
$row->setInsertId(strtotime('now')); // insertId = current Unix timestamp (one-second resolution)
$rows[0] = $row;

$req = new Google_Service_Bigquery_TableDataInsertAllRequest();
$req->setKind('bigquery#tableDataInsertAllRequest');
$req->setRows($rows);

$this->service->tabledata->insertAll($projectid, $datasetid, $tableid, $req);
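One thing we wonder about (an assumption on our side, not confirmed): BigQuery uses each row's insertId for best-effort deduplication, and PHP's strtotime('now') has one-second resolution, so every row streamed within the same second would carry the same insertId. A minimal Python sketch of how such deduplication could collapse our rows (dedupe_by_insert_id is a hypothetical model, not the actual BigQuery implementation):

```python
def dedupe_by_insert_id(rows):
    """Keep only the last row seen for each insertId, mimicking
    best-effort deduplication keyed on insertId."""
    seen = {}
    for insert_id, row in rows:
        seen[insert_id] = row  # later rows with the same id overwrite earlier ones
    return list(seen.values())

# 1000 rows streamed within the same second all share one timestamp insertId,
# so after deduplication only a single row survives.
rows = [(1700000000, {"cookieid": "c%d" % i}) for i in range(1000)]
print(len(dedupe_by_insert_id(rows)))  # → 1
```

If this assumption is right, it would explain how 299K "successful" requests could yield far fewer stored rows whenever several requests fall into the same second.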
Thank you in advance,
Cihan