
I'm trying to intentionally insert one dummy record with full schema data into a BigQuery GA360 table.

One way I figured out is to INSERT all the fields:

INSERT INTO <project.dataset.table> (visitorId,visitStartTime,date,totals,customDimension.......)
VALUES ( 1234, 12345, 20200101, ( 1,2,3,4,5,6,7,8,9,10,11,12,13),[(1,"asd"),....].....)

I gave up when I got to the "hits" field, which has a complicated nested structure.

Is there any other way (for example, something I can imagine is coding against the client API using the table schema JSON file), or has anyone succeeded with DDL?
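
To illustrate the client-API idea, this is roughly what I have in mind (a minimal sketch; the table name and field values are placeholders, nested RECORDs become dicts, and REPEATED fields like "hits" become lists of dicts):

# Sketch with placeholder names: stream one dummy row via the
# Python client. Nested RECORDs are dicts; REPEATED fields are
# lists, so "hits" is a list of dicts mirroring the GA360 schema.
from google.cloud import bigquery

client = bigquery.Client()
row = {
    "visitorId": 1234,
    "visitStartTime": 12345,
    "date": "20200101",
    "totals": {"visits": 1, "hits": 2, "pageviews": 3},
    "customDimensions": [{"index": 1, "value": "asd"}],
    "hits": [{"hitNumber": 1, "type": "PAGE", "page": {"pagePath": "/"}}],
}
errors = client.insert_rows_json("project.dataset.table", [row])
print(errors)  # empty list on success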


1 Answer


Here's how I would solve this:

  1. Get a sample as a JSON object:

SELECT TO_JSON_STRING(a)
FROM `bigquery-public-data.google_analytics_sample.ga_sessions_20170801` a
LIMIT 1
  2. Save that long JSON string to a file, for example as sketched below.
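
A minimal sketch of steps 1 and 2 together with the Python client library (the file name my.json is an assumption chosen to match the load step later):

# Sketch: run the sampling query and write the JSON string as one
# newline-delimited line in my.json.
from google.cloud import bigquery

client = bigquery.Client()
sql = """
SELECT TO_JSON_STRING(a) AS row_json
FROM `bigquery-public-data.google_analytics_sample.ga_sessions_20170801` a
LIMIT 1
"""
with open("my.json", "w") as f:
    for row in client.query(sql).result():
        f.write(row.row_json + "\n")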

  3. Modify that file as much as you want.

  4. Create an empty table with the same schema to load that file into:

-- LIMIT 0 clones the schema without copying any rows
CREATE TABLE `temp.analy` AS
SELECT *
FROM `bigquery-public-data.google_analytics_sample.ga_sessions_20170801`
LIMIT 0
  5. Load your file with the manually modified values:


bq load --source_format=NEWLINE_DELIMITED_JSON temp.analy my.json

Or:

5b. Create a federated table that reads the JSON file, so you can:

INSERT INTO `table`
SELECT *
FROM `federated_table_that_reads_the_json_file`
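
For reference, a sketch of how that federated table could be defined with the Python client library (the bucket URI and table IDs are placeholders; the GA sample table's schema is reused so BigQuery can parse the nested fields):

# Sketch with placeholder names: define an external (federated)
# table over the edited JSON file, reusing the GA export schema.
from google.cloud import bigquery

client = bigquery.Client()

ext = bigquery.ExternalConfig("NEWLINE_DELIMITED_JSON")
ext.source_uris = ["gs://my-bucket/my.json"]  # placeholder URI
ext.schema = client.get_table(
    "bigquery-public-data.google_analytics_sample.ga_sessions_20170801"
).schema

table = bigquery.Table("my-project.temp.federated_table_that_reads_the_json_file")
table.external_data_configuration = ext
client.create_table(table)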