I'm following the Python Client Libraries for the Google BigQuery API - https://googlecloudplatform.github.io/google-cloud-python/stable/bigquery/usage.html#jobs > Querying data (asynchronous).
When it comes to "Retrieve the results", executing the code:
rows, total_count, token = query.fetch_data()  # API request
always raises ValueError: too many values to unpack (expected 3).
(By the way, I think there's a typo in the doc: it should be results.fetch_data() instead of query.fetch_data().)
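I suspect fetch_data() now returns a plain iterator over rows rather than the (rows, total_count, token) 3-tuple the doc shows, which would explain the unpacking error. A minimal stand-alone reproduction of the same ValueError (fetch_data_stub is just a stand-in generator I made up, not the library call):

```python
def fetch_data_stub():
    # Hypothetical stand-in: yields rows one by one,
    # the way an iterator-returning fetch_data() would.
    for i in range(4):
        yield (i, "row-%d" % i)

try:
    # Unpacking an iterator of more than 3 items into 3 names
    # raises the same error I'm seeing.
    rows, total_count, token = fetch_data_stub()
except ValueError as exc:
    print(exc)  # too many values to unpack (expected 3)

# Consuming the iterator directly works fine:
tbl = list(fetch_data_stub())
print(len(tbl))  # 4
```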
However, the following code works fine:
results = job.results()
rows = results.fetch_data()
tbl = [x for x in rows]
All the rows of the table are returned in tbl (as a list of tuples) in a single shot: more than 225K rows!
Can anyone explain why I get the error, or is there something wrong in the doc?
And how can I still retrieve the results in batches, iterating through them page by page?
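In the meantime, would a generic batching pattern like the one below be reasonable? It chunks whatever iterator fetch_data() returns without materializing all ~225K rows at once. (in_batches is my own helper name, not part of the library, and the batch size is arbitrary.)

```python
from itertools import islice

def in_batches(rows, batch_size):
    """Yield lists of at most batch_size items from any row iterator.

    Generic sketch: works on any iterable, so it should apply to
    whatever fetch_data() yields, e.g.
        rows = job.results().fetch_data()
        for batch in in_batches(rows, 10000): ...
    """
    it = iter(rows)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

# Usage with a stand-in iterator of 10 "rows" in batches of 4:
for batch in in_batches(range(10), batch_size=4):
    print(len(batch))  # prints 4, 4, 2
```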
Thanks a lot in advance!