2
votes

The Microsoft process looks like a batch import method of copying data from SQL Server into Azure Data Warehouse.

[Image: diagram of Microsoft's batch import process from SQL Server into Azure SQL Data Warehouse]

Is there a simpler method for continuously streaming data, second by second, from MS SQL Server into the Data Warehouse? The documented approach seems overly complicated, with two ETL steps (Azure Data Factory, then PolyBase). Can we continually stream data from SQL Server into the Data Warehouse? (We know AWS allows streaming data from SQL Server into a Redshift DW; see Stream Data from SQL Server into Redshift.)

https://azure.microsoft.com/en-us/services/sql-data-warehouse/

I mean Polybase is a checkbox in Data Factory so there isn't a great deal of extra work for you to do. The reason it's there is because it's the recommended fastest way to get data into Azure SQL Data Warehouse. ADW is an expensive product to keep running 24x7 so you could just 'stream' (copy) your data into Data Lake, and create an external Polybase table over that. Materialise it into the warehouse using CTAS for performance if required when the warehouse starts. - wBob
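The external-table-plus-CTAS pattern wBob describes can be sketched in T-SQL. This is a minimal sketch under assumptions: the external data source, file format, and all table/column names below are hypothetical and would need to match your own Data Lake layout:

```sql
-- Assumes an external data source and file format have already been created
-- over the Data Lake folder that ADF copies files into (names are hypothetical).
CREATE EXTERNAL TABLE ext.SalesStaging
(
    SaleId       INT,
    SaleAmount   DECIMAL(18, 2),
    ModifiedDate DATETIME2
)
WITH
(
    LOCATION    = '/sales/incoming/',   -- folder in the Data Lake
    DATA_SOURCE = MyDataLakeSource,     -- hypothetical external data source
    FILE_FORMAT = ParquetFileFormat     -- hypothetical file format
);

-- Materialise into the warehouse with CTAS for query performance,
-- run when the warehouse is started.
CREATE TABLE dbo.Sales
WITH
(
    DISTRIBUTION = HASH(SaleId),
    CLUSTERED COLUMNSTORE INDEX
)
AS
SELECT SaleId, SaleAmount, ModifiedDate
FROM ext.SalesStaging;
```

This keeps the warehouse paused most of the time: data lands cheaply in the Data Lake, and the CTAS step is only paid for while the warehouse is running.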

2 Answers

0
votes

Your link does not explain how to stream data from SQL Server to Redshift; rather, it suggests a data migration service that continually ETLs data from source to destination. If that is your aim, you can write an SSIS package (i.e. a migration service) and schedule it to run on a continuing basis using Azure Data Factory.
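A continual ETL job of this kind typically copies only rows changed since the last run, keyed on a high-water mark. A minimal T-SQL sketch of that pattern, assuming hypothetical source and destination tables with a ModifiedDate column and a small control table holding the watermark:

```sql
-- Hypothetical control table holding the last-loaded timestamp:
-- CREATE TABLE dbo.WatermarkControl (TableName SYSNAME, LastLoaded DATETIME2);

DECLARE @watermark DATETIME2;

-- Read the high-water mark left by the previous run.
SELECT @watermark = LastLoaded
FROM dbo.WatermarkControl
WHERE TableName = 'dbo.Sales';

-- Copy only rows modified since the last run.
INSERT INTO dbo.SalesDestination (SaleId, SaleAmount, ModifiedDate)
SELECT SaleId, SaleAmount, ModifiedDate
FROM dbo.Sales
WHERE ModifiedDate > @watermark;

-- Advance the watermark so the next scheduled run resumes from here.
UPDATE dbo.WatermarkControl
SET LastLoaded = SYSUTCDATETIME()
WHERE TableName = 'dbo.Sales';
```

Scheduling this on a short interval (via ADF or SQL Server Agent) approximates streaming without a dedicated replication service.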

-1
votes

Create a Data Flow Task (DFT) in an SSIS package with an OLE DB Source and OLE DB Destination, then schedule it as a SQL Server Agent job in SSMS. This is much simpler.