1
vote

json_data = '["xxxxxx",65465464.0,2,-1,10.10]'

Schema of the BigQuery table:

id STRING NULLABLE
timestamp STRING NULLABLE
xid INTEGER NULLABLE
yid INTEGER NULLABLE
magnitude FLOAT NULLABLE

script.py:

import json

data = json.loads(json_data)  # parses to a Python list
table.reload()
rows = [data]                 # one row, in schema order
errors = table.insert_data(rows)

Error:

errors = table.insert_data(rows)
  File "/usr/local/lib/python2.7/dist-packages/google/cloud/bigquery/table.py", line 749, in insert_data
    value = _microseconds_from_datetime(value) * 1e-6
  File "/usr/local/lib/python2.7/dist-packages/google/cloud/_helpers.py", line 363, in _microseconds_from_datetime
    if not value.tzinfo:
AttributeError: 'float' object has no attribute 'tzinfo'

Does anyone have any idea what causes this error?

Appreciated!

Without seeing your data it's hard to know for sure, but it looks like BigQuery is expecting something to be a datetime.datetime object and you're passing a float. - fdsa
@fdsa Basically, I have timestamp data as a UNIX epoch, which you can see as the float in the JSON record. When I trigger a batch job to load a JSON file of this data, it works fine without any conversion of the float to a datetime. But for the same JSON records, when I try to insert them one at a time using the BigQuery API (as shown above), it throws the error above. Any pointers would be helpful! - Revan

1 Answer

1
vote

@fdsa is correct. When you upload a batch file, BigQuery will accept floats and strings for TIMESTAMP fields, because you can't store a datetime object in a file. But to use insert_data, BigQuery expects the timestamp to be a datetime.datetime object, so load the UNIX timestamp into a datetime before inserting:

 datetime.datetime.fromtimestamp(your_unix_timestamp)
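Putting it together with the record from the question, a minimal sketch of the fix looks like this (the field index 1 is an assumption based on the schema order shown above; the insert_data call is commented out since it needs a live table):

import datetime
import json

# Sample record from the question: id, timestamp (UNIX epoch), xid, yid, magnitude
json_data = '["xxxxxx",65465464.0,2,-1,10.10]'
row = json.loads(json_data)

# Convert the epoch float in the timestamp column (index 1, per the
# schema order in the question) to a datetime before inserting.
row[1] = datetime.datetime.fromtimestamp(row[1])

rows = [row]
# errors = table.insert_data(rows)  # as in the question's script

After this conversion, insert_data can serialize the timestamp correctly instead of failing on the float.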