
I am using the spark-bigquery-connector to read data from BigQuery: https://cloud.google.com/dataproc/docs/tutorials/bigquery-connector-spark-example. I need to check whether a table exists before reading from it; otherwise the API throws the error

"Not found: Table sample_proj:sample_dataset.table"

Is there a way to handle this in the spark-bigquery-connector?

Thanks

If the table is that important, you can check whether it exists and fail the process if it doesn't. If it does exist, you can start the Spark BigQuery operation. Please share if you find any other option. – Never_Give_Up

1 Answer


As of now, errors in BigQuery (e.g. a table that does not exist, or permission issues) will not make the Spark application exit or stop, which can be a problem. To avoid this, split the work into two tasks: first check whether the table exists, then run the Spark processing as a separate step.
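
A minimal PySpark sketch of that two-step approach, assuming the google-cloud-bigquery client library is installed alongside the connector; the table name is a placeholder taken from the error message in the question:

    from google.cloud import bigquery
    from google.cloud.exceptions import NotFound
    from pyspark.sql import SparkSession

    # Placeholder table name from the error message in the question.
    TABLE_ID = "sample_proj.sample_dataset.table"

    def table_exists(client, table_id):
        """Return True if the BigQuery table exists, False otherwise."""
        try:
            client.get_table(table_id)  # raises NotFound when the table is missing
            return True
        except NotFound:
            return False

    client = bigquery.Client()
    if not table_exists(client, TABLE_ID):
        # Fail fast in our own code instead of letting the Spark read blow up.
        raise SystemExit("Table {} does not exist; aborting.".format(TABLE_ID))

    spark = SparkSession.builder.appName("bq-read").getOrCreate()
    df = spark.read.format("bigquery").option("table", TABLE_ID).load()
    df.show()

Doing the existence check with the BigQuery client before the Spark read keeps the failure in your own code, where you can log it or exit cleanly, rather than inside the connector's read path.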