0
votes

I have a table in BigQuery with 35 million rows that I want to turn into a Tableau extract, but the row count is so large that Tableau can't add them all at once. The solution I came up with was to first limit the number of rows presented by the BigQuery view, by only showing rows within a particular date range, and perform a full extract. Then, after the extract completes, I gradually increase the number of rows the view displays (by altering the WHERE clause) and have Tableau perform an incremental extract based on the field containing the timestamp. That worked for a few of the incremental updates, but now I've run into a problem where Tableau Desktop says 'Required columns are not present in the remote data source. Perform full refresh of the extract', even though nothing in the data source or within Tableau has changed other than the WHERE clause controlling the range of dates the view presents. This was apparently a problem in Tableau 8, but I'm on Tableau 9.3. Does anyone have any suggestions?
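For reference, the stepped-window approach described above can be sketched as follows. This is a minimal illustration, not the exact setup from the question: the dataset, table, and column names (`my_dataset.events`, `event_ts`) and the 30-day step size are placeholders.

```python
# Sketch: widen a BigQuery view's date window in steps, so that after each
# step Tableau can run an incremental extract keyed on the timestamp column.
# All object names below are hypothetical placeholders.
from datetime import date, timedelta

def view_sql(cutoff: date) -> str:
    """Build the SQL redefining the view to expose rows before `cutoff`."""
    return (
        "CREATE OR REPLACE VIEW my_dataset.events_extract AS\n"
        "SELECT * FROM my_dataset.events\n"
        f"WHERE event_ts < TIMESTAMP('{cutoff.isoformat()}')"
    )

# Advance the cutoff in fixed steps; between steps, run a Tableau
# incremental refresh based on event_ts before widening the window again.
start, end, step = date(2016, 1, 1), date(2016, 4, 1), timedelta(days=30)
cutoffs = []
cutoff = start
while cutoff <= end:
    cutoffs.append(cutoff)
    cutoff += step
```

Each generated statement would be run against BigQuery (for example via the `bq` CLI or the BigQuery console) before triggering the next incremental refresh in Tableau.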

1
Can you access the logs? They should show you what is being executed, so you may be able to identify the cause within them. – Nick
Where would I find the logs to access them? – Brad Davis
Wherever your repository is. For instance, it may be: C:\Users\username\Documents\My Tableau Repository\Logs – Nick

1 Answer

0
votes

I'm not sure if this will be a solution for everyone, but the problem I was having seems to have been caused by the number of connections being made to Google BigQuery per hour.