
I have data in BigQuery from my Google Analytics account, along with some extra tables where I have transformed some of this data.

I would like to export some of my transformed data from BigQuery and import it into Google Analytics as a custom dimension.

I have done this manually, by downloading a CSV from my table in BigQuery and importing it using the GA admin UI. I would like to automate the process, but I'm not sure where to start.

What would be the most efficient tool to automate this process? The process being:

  1. Run a SQL query on my BQ data every day and overwrite a table.

  2. Export this table as a file and upload it to a GA account as a query-time import.


1 Answer


Not sure why you'd want to do this, but one (rather clunky) solution that pops into my head is to spin up a small GCE instance and, using the gcloud tooling and some simple bash:

  1. Run a BigQuery query job (SQL) to truncate your table
  2. Monitor the progress of that query job, i.e. wait
  3. When it's finished, trigger an export job and dump the table to GCS
  4. Monitor the progress of that BigQuery export job i.e. wait
  5. When it's finished, download the file from GCS
  6. Upload the file to GA using the management API (https://developers.google.com/analytics/devguides/config/mgmt/v3/mgmtReference/management/uploads/uploadData)
  7. Schedule a cron job to run the above bash script daily (see the sketch after this list)
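
For illustration, here's a minimal sketch of that script. All the names (dataset, table, bucket, and the GA account/property/data-source IDs) are placeholders you'd swap for your own, the SQL is a stand-in for your actual transform, and it assumes the instance's service account has the Analytics edit scope:

    #!/usr/bin/env bash
    set -euo pipefail

    DATASET="my_dataset"          # placeholder: your BigQuery dataset
    TABLE="ga_dimension_export"   # placeholder: the table you overwrite daily
    BUCKET="my-export-bucket"     # placeholder: a GCS bucket you own
    ACCOUNT_ID="123456"           # placeholder: GA account ID
    PROPERTY_ID="UA-123456-1"     # placeholder: GA web property ID
    DATA_SOURCE_ID="abcdef"       # placeholder: GA custom data source ID

    # Steps 1 & 2: run the query and overwrite the destination table.
    # The bq tool runs jobs synchronously by default, so the "wait" is implicit.
    bq query \
      --use_legacy_sql=false \
      --destination_table="${DATASET}.${TABLE}" \
      --replace \
      'SELECT clientId, dimensionValue FROM `my_project.my_dataset.transformed`'

    # Steps 3 & 4: export the table to GCS as CSV (also synchronous).
    # Note: GA data import expects the CSV header to match the data set
    # schema you defined in GA (e.g. ga:clientId,ga:dimension1).
    bq extract \
      --destination_format=CSV \
      "${DATASET}.${TABLE}" \
      "gs://${BUCKET}/${TABLE}.csv"

    # Step 5: download the file from GCS.
    gsutil cp "gs://${BUCKET}/${TABLE}.csv" "/tmp/${TABLE}.csv"

    # Step 6: upload to GA via the Management API's media upload endpoint.
    ACCESS_TOKEN="$(gcloud auth print-access-token)"
    curl -X POST \
      -H "Authorization: Bearer ${ACCESS_TOKEN}" \
      -H "Content-Type: application/octet-stream" \
      --data-binary "@/tmp/${TABLE}.csv" \
      "https://www.googleapis.com/upload/analytics/v3/management/accounts/${ACCOUNT_ID}/webproperties/${PROPERTY_ID}/customDataSources/${DATA_SOURCE_ID}/uploads?uploadType=media"

For step 7, a crontab entry along these lines (path is a placeholder) would run it daily:

    0 6 * * * /path/to/export_to_ga.sh >> /var/log/export_to_ga.log 2>&1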

A nicer way would be to use a Cloud Function listening on the GCS bucket, but in my opinion CFs are not designed for long-running batch/data workloads; they have, for example, a 540-second execution time limit. It would also be much better if GA supported loading directly from GCS, but I wasn't able to find support for that.