
I want to load data from Avro files stored in GCS into a BigQuery table. Two of the fields in the Avro files contain datetime values.

I have Avro files with a couple of fields, one of which holds epoch time values as an Avro long. I want that field to be loaded as a TIMESTAMP when loading the Avro data into a BigQuery table using the command-line tool or the console. For example, a column has the value 1614004223589, and I want it loaded as 2021-02-22 14:30:23 UTC. I first created an empty table in BigQuery with that field's datatype set to TIMESTAMP, then loaded the data into the table using the CLI. But instead of roughly 2021-02-22 14:30:24 I am getting 1970-01-19 16:20:04.135924 UTC.
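For reference, the load command looked roughly like this (the dataset, table, and bucket names here are placeholders for my actual ones):

    bq load --source_format=AVRO mydataset.mytable gs://mybucket/path/file.avro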

Any leads will be appreciated.


1 Answer


This is because your column holds milliseconds since the Unix epoch, but when BigQuery loads an Avro long into a TIMESTAMP column it interprets the value as microseconds since the epoch, which is why the result lands in January 1970. Either multiply the column by 1000 before loading so the values are in microseconds, or load the field as INT64 and convert it with the TIMESTAMP_MILLIS function (equivalently, divide by 1000 and use TIMESTAMP_SECONDS); it will work.
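As a quick sanity check on the value from the question (TIMESTAMP_MICROS and TIMESTAMP_MILLIS are standard BigQuery SQL functions):

    bq query --use_legacy_sql=false 'SELECT
      TIMESTAMP_MICROS(1614004223589) AS as_micros,  -- 1970-01-19 16:20:04.223589 UTC, i.e. how the load interpreted it
      TIMESTAMP_MILLIS(1614004223589) AS as_millis   -- 2021-02-22 14:30:23.589 UTC, the intended value'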

You can try the following approaches to achieve this.

Step 1: Load the data into BigQuery in the same format as it appears in the source file (keep the epoch field as INT64).
Step 2: Create a view over the newly loaded table and apply the timestamp transformation in that view (see the sketch below); going forward, always query the view.
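A minimal sketch of such a view, assuming the table is mydataset.mytable and the epoch column is named event_ts_millis (both placeholder names):

    bq query --use_legacy_sql=false '
    CREATE OR REPLACE VIEW mydataset.mytable_view AS
    SELECT
      * EXCEPT (event_ts_millis),                    -- keep all other columns as-is
      TIMESTAMP_MILLIS(event_ts_millis) AS event_ts  -- converted TIMESTAMP
    FROM mydataset.mytable'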

Alternate solution:
Step 1: Load the data into BigQuery in the same format as it appears in GCS.
Step 2: Create a function for the timestamp transformation and apply it to the epoch column whenever you query the data (see the sketch below).
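A sketch of a persistent SQL UDF for that (again, mydataset and the column name are placeholders):

    bq query --use_legacy_sql=false '
    CREATE OR REPLACE FUNCTION mydataset.millis_to_ts(millis INT64) AS (
      TIMESTAMP_MILLIS(millis)  -- milliseconds since epoch -> TIMESTAMP
    )'

After that, queries can call SELECT mydataset.millis_to_ts(event_ts_millis) FROM mydataset.mytable instead of reading the raw epoch value.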