I'm currently working on a data warehousing project with BigQuery.
BQ used to have this quota:
Daily destination table update limit — 1,000 updates per table per day
While this quota still appears in the documentation, I understand that it has been removed, according to this blog post:
https://cloud.google.com/blog/products/data-analytics/dml-without-limits-now-in-bigquery
In our project we need live data, which requires a lot of updates. Before this blog post I would have gathered the records, e.g. on GCS, and pushed them into BQ every ~14 minutes.
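To make the micro-batching approach concrete, here is a minimal sketch of the pattern I describe above. The names (`MicroBatcher`, `flush_fn`) are my own illustration, not a Google API; in practice `flush_fn` would write the buffered rows to GCS and trigger a BQ load job:

```python
import time

class MicroBatcher:
    """Buffers rows and flushes them in bulk once an interval elapses.

    Hypothetical sketch of the "collect records, push every ~14 minutes"
    approach; flush_fn stands in for a GCS write + BigQuery load job.
    """

    def __init__(self, flush_fn, interval_s=14 * 60):
        self.flush_fn = flush_fn        # called with the buffered rows
        self.interval_s = interval_s    # ~14 minutes by default
        self.buffer = []
        self.last_flush = time.monotonic()

    def add(self, row):
        self.buffer.append(row)
        # Only one table update per interval, which kept us well
        # under the old 1,000-updates-per-day quota.
        if time.monotonic() - self.last_flush >= self.interval_s:
            self.flush()

    def flush(self):
        if self.buffer:
            self.flush_fn(self.buffer)
            self.buffer = []
        self.last_flush = time.monotonic()
```

With `interval_s=14 * 60` this yields at most ~103 load jobs per day per table; the trade-off is that data is up to 14 minutes stale, which is exactly the problem for a live-data requirement.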
With the removal of the table update limit, we could now stream all data into BQ immediately, which would actually be vital for our solution, as live data is required.
Question: Would you recommend now to stream data directly to BQ? Any objections?
I'm asking because the removal of the quota doesn't automatically make this a best practice. How do you handle the requirement for live data? Another option so far has been external data sources, with their known limitations.
Thank you for your answers!