3
votes

I want to analyze the activity on BigQuery during the past month.

I went to the cloud console and the (very inconvenient) log viewer. I set up exports to BigQuery, and now I can run queries on the logs and analyze the activity. There is even a very convenient guide here: https://cloud.google.com/bigquery/audit-logs.
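
For anyone setting this up programmatically rather than in the console, here is a minimal sketch, assuming the google-cloud-logging Python client; the project ID, sink name, and dataset below are hypothetical placeholders:

# A minimal sketch: create a Cloud Logging sink that exports BigQuery
# audit log entries into a BigQuery dataset (all names are placeholders).
from google.cloud import logging

client = logging.Client(project="your-project-id")
destination = (
    "bigquery.googleapis.com/projects/your-project-id/datasets/audit_logs"
)
sink = client.sink(
    "bigquery-audit-sink",
    filter_='resource.type="bigquery_resource"',
    destination=destination,
)
sink.create()  # from now on, matching log entries flow into the dataset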

However, all of this only helps with data collected from now on. I need to analyze the past month.

Is there a way to export existing logs (rather than new ones) to BigQuery (or to a flat file, to be loaded into BQ later)?

Thanks


3 Answers

1
votes

While you cannot "backstream" BigQuery's past logs, there is still something you can do, depending on what kind of information you're looking for. If you need information about query jobs (job stats, configuration, etc.), you can call the Jobs: list method of the BigQuery API to list all jobs in your project. The data is preserved there for six months, and if you're the project owner, you can list the jobs of all users, regardless of who actually ran them.

If you don't want to code anything, you can even use the API Explorer to call the method, save the output as a JSON file, and then load it back into a BigQuery table.


There is sample code to list jobs with the BigQuery API; it requires some modification, but it should be fairly easy to get it done. A rough sketch follows.
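
This is a minimal sketch (not the original sample), assuming the google-cloud-bigquery Python client and application-default credentials; the project ID is a placeholder:

# A minimal sketch: list the last 30 days of jobs across all users
# (listing other users' jobs requires owner-level permissions).
from datetime import datetime, timedelta, timezone

from google.cloud import bigquery

client = bigquery.Client(project="your-project-id")  # placeholder project

min_time = datetime.now(timezone.utc) - timedelta(days=30)
for job in client.list_jobs(all_users=True, min_creation_time=min_time):
    print(job.job_id, job.user_email, job.created, job.state)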

1
votes

You can use the Jobs: list API to collect job info and upload it to GBQ.
Once it is in GBQ, you can analyze it any way you want, using the full power of BigQuery.
You can either flatten the result or use the original; I recommend the original, as it is less of a headache: no transformation is needed before loading into GBQ (you literally upload whatever you get from the API). Of course, all this lives in a simple app/script that you still have to write.

Note: make sure you use the full value for the projection parameter (see the sketch below).
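
A minimal sketch of such a script, assuming google-api-python-client and application-default credentials; the project ID and output file name are placeholders:

# A minimal sketch: page through Jobs: list with projection=full (so the
# full job configuration is returned) and write one job per line as
# newline-delimited JSON, ready for a GBQ load with schema autodetection.
import json

from googleapiclient.discovery import build

service = build("bigquery", "v2")
request = service.jobs().list(
    projectId="your-project-id",  # placeholder
    allUsers=True,                # include other users' jobs (owner only)
    projection="full",            # "minimal" would omit the configuration
)
with open("jobs.ndjson", "w") as out:
    while request is not None:
        response = request.execute()
        for job in response.get("jobs", []):
            out.write(json.dumps(job) + "\n")
        request = service.jobs().list_next(request, response)

The resulting file can then be loaded as-is, for example with bq load --autodetect --source_format=NEWLINE_DELIMITED_JSON.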

0
votes

I was facing the same problem when I found an article describing how to inspect BigQuery usage via INFORMATION_SCHEMA, without any script or the Jobs: list approach mentioned in the other answers. I ran it and got it working.


# Monitor query costs in BigQuery; standard-sql; 2020-06-21
# @see http://www.pascallandau.com/bigquery-snippets/monitor-query-costs/

DECLARE timezone STRING DEFAULT "Europe/Berlin";
DECLARE gb_divisor INT64 DEFAULT 1024*1024*1024;
DECLARE tb_divisor INT64 DEFAULT gb_divisor*1024;
DECLARE cost_per_tb_in_dollar INT64 DEFAULT 5;  # on-demand price per TB scanned
DECLARE cost_factor FLOAT64 DEFAULT cost_per_tb_in_dollar / tb_divisor;

SELECT
  DATE(creation_time, timezone) AS creation_date,
  # render the creation time as YYYY-MM-DD HH:MM:SS in the configured timezone
  FORMAT_TIMESTAMP("%F %H:%M:%S", creation_time, timezone) AS query_time,
  job_id,
  ROUND(total_bytes_processed / gb_divisor, 2) AS bytes_processed_in_gb,
  # cache hits are free of charge, so only bill non-cached queries
  IF(cache_hit != true, ROUND(total_bytes_processed * cost_factor, 4), 0) AS cost_in_dollar,
  project_id,
  user_email
FROM
  `region-us`.INFORMATION_SCHEMA.JOBS_BY_USER
WHERE
  DATE(creation_time) BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY) AND CURRENT_DATE()
ORDER BY
  bytes_processed_in_gb DESC

Credits: https://www.pascallandau.com/bigquery-snippets/monitor-query-costs/