I need to insert 1 million (or more) records from a SQL Server table into a BigQuery table. The BigQuery table is exposed in SQL Server as a linked server via the CData ODBC driver with its remoting daemon (documentation).
Note that the source table may have no row-number, Id, or similar key column.
Currently I can only insert about 1 record per second into BigQuery with this driver, using this query:
INSERT INTO [GBQ].[CDataGoogleBigQuery].[GoogleBigQuery].[natality]
SELECT *
FROM [natality].[dbo].[natality]
GO
For a volume of 1 million records or more, that is extremely low performance.
I believe there is a workaround that could speed up the insertion process. Thanks in advance.
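For reference, one variant I'm considering is batching the insert so each statement pushes a block of rows through the linked server instead of going row-at-a-time. This is an untested sketch: the staging table, the `RowId` identity column, and the 10,000-row batch size are all my assumptions, and the column list would need to be written out explicitly since `RowId` does not exist in the target table.

```sql
-- Untested sketch: stage the rows with a synthetic key, then insert
-- in fixed-size batches through the linked server.
SELECT IDENTITY(INT, 1, 1) AS RowId, *
INTO #staging
FROM [natality].[dbo].[natality];

DECLARE @batch INT = 10000;  -- batch size is a guess; tune it
DECLARE @i INT = 0;
DECLARE @max INT = (SELECT MAX(RowId) FROM #staging);

WHILE @i < @max
BEGIN
    INSERT INTO [GBQ].[CDataGoogleBigQuery].[GoogleBigQuery].[natality]
    SELECT <column list>  -- real columns here, excluding RowId
    FROM #staging
    WHERE RowId > @i AND RowId <= @i + @batch;

    SET @i = @i + @batch;
END;
```

I don't know whether the CData driver actually coalesces such a batch into fewer round trips, so I'd welcome corrections on this approach as well.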