
I'm working to lift an SSIS package into Azure Data Factory V2, and I have successfully set up an IR and executed the package.

Now I'm attempting to work with the results in ADF. This package was originally designed to return a recordset to the calling client. Now that I'm in ADF, I'd like to take the recordset produced by the package and copy it to table storage. However, I see no way to access this recordset from within the ADF pipeline.

Is it possible to access and process this recordset from the host ADF pipeline, or will the package itself have to be modified to no longer return a recordset and perform the copy instead?

No, it's not possible to do this "natively". Returning a recordset from an SSIS package to be consumed by something else is definitely an atypical design; I would almost call it an anti-pattern. How does the client currently consume the recordset? How does SSIS even return a recordset? – Nick.McDermaid

Ha... this is my first foray into ETL development, so I have no answer for you. However, you did confirm my suspicion that this wasn't best practice, so thanks for that. – maf748

1 Answer


In the SSIS package, write the recordset out to a text file and copy it to a container/folder in Blob Storage, or even to an on-premises folder.

https://www.powerobjects.com/blog/2018/11/20/uploading-azure-blob-ssis/

Now your ADF pipeline can read that blob as the source in a Copy activity and write it to Table Storage as the sink.
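For reference, the Copy activity for that last step might look roughly like the sketch below in ADF V2's JSON pipeline definition. The dataset names (`BlobRecordsetDataset`, `TableStorageDataset`) are placeholders you would replace with your own datasets: the first pointing at the delimited text file the package dropped in blob storage, the second at the target Azure table.

```json
{
    "name": "CopyRecordsetToTableStorage",
    "type": "Copy",
    "inputs": [
        { "referenceName": "BlobRecordsetDataset", "type": "DatasetReference" }
    ],
    "outputs": [
        { "referenceName": "TableStorageDataset", "type": "DatasetReference" }
    ],
    "typeProperties": {
        "source": { "type": "DelimitedTextSource" },
        "sink": {
            "type": "AzureTableSink",
            "azureTableInsertType": "merge"
        }
    }
}
```

You can chain this Copy activity after the Execute SSIS Package activity in the same pipeline, so the recordset file is copied as soon as the package finishes.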

Let me know if you need more details on the implementation.