
I can create and run a GCP job (using the PHP API) to query a Google BigQuery dataset and export/save the result set as a CSV file to Google Cloud Storage (GCS).

This works fine, but when the result set is huge, it takes a while to move the data to GCS.

Is there a way to get the progress or an estimated time of completion?

Or, alternatively, is there any method to find out how many MB/GB of data have been successfully moved to GCS so far?


1 Answer


It’s not possible to retrieve the quantity of data transferred so far or an estimate of when the job will finish. However, using the BigQuery Client Library for PHP you can list the IDs of currently running jobs. For example:

use Google\Cloud\BigQuery\BigQueryClient;

$bigQuery = new BigQueryClient([
    'projectId' => $projectId,
]);

// List only jobs that are still in the RUNNING state.
$jobs = $bigQuery->jobs([
    'stateFilter' => 'running'
]);

foreach ($jobs as $job) {
    echo $job->id() . PHP_EOL;
}

In this way you can confirm that your job is still running.
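Building on that, a minimal sketch of polling a single job until it finishes, assuming the same `BigQueryClient` setup and that `$projectId` and the job's ID (`$jobId` here, a placeholder you would supply) are already defined:

```php
use Google\Cloud\BigQuery\BigQueryClient;

// $projectId and $jobId are assumed to be defined elsewhere.
$bigQuery = new BigQueryClient([
    'projectId' => $projectId,
]);

$job = $bigQuery->job($jobId);

// Poll until the job reaches the DONE state.
while (!$job->isComplete()) {
    sleep(5);          // wait a few seconds between checks
    $job->reload();    // refresh the job's state from the API
}

echo 'Job ' . $job->id() . ' finished.' . PHP_EOL;
```

This doesn't give you a percentage, but it does tell you the moment the export is done, which is often enough to trigger the next step in a pipeline.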