Using Google's BigQuery Python API (`google-cloud-bigquery`), is it possible to fetch data from a BigQuery table (GCP) in batches without repetition (i.e., downloading a large dataset in small batches rather than all at once)?
For example, if I have a table with 10 million rows, can I run 10 fetch iterations, each downloading 1 million new rows, so that every row is fetched exactly once across all 10 iterations?
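
Here is a rough sketch of what I have in mind, using `Client.list_rows` with `page_size` (the table ID is just a placeholder, and I'm not sure whether iterating the pages like this actually guarantees no row is repeated):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder table ID; substitute the real project/dataset/table.
table = client.get_table("my-project.my_dataset.my_table")

# Ask for pages of up to 1 million rows; my hope is that iterating
# rows.pages yields each row exactly once across all pages.
rows = client.list_rows(table, page_size=1_000_000)
for page in rows.pages:
    batch = [dict(row) for row in page]  # process one batch of rows
    print(f"fetched {len(batch)} rows")
```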