I'm attempting to deploy an Azure Data Factory with a Copy Data pipeline that pulls files from one or more on-premises file system paths and lands them in Blob Storage. The source paths may span multiple drives (e.g., C:\fileshare1 vs. D:\fileshare2) and may include network locations referenced via UNC paths (e.g., \\localnetworkresource\fileshare3).
I'd like to configure a single file system linked service and source dataset and parameterize the linked service's host property; the pipeline would then iterate over a collection of file share paths, reusing the same dataset and connection. However, there doesn't appear to be any way for the dataset or pipeline to supply the host value to the linked service. It's certainly possible to supply folder information from the pipeline and dataset, but that value gets concatenated onto the host specified in the linked service, so it can't reach different drives or network resources. The shape I have in mind is sketched below.
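For context, here's roughly the linked service definition I'm after, following the pattern from the parameterization doc linked at the bottom. All names are placeholders, the password is assumed to come from a Key Vault linked service, and connectVia assumes a self-hosted integration runtime named MySelfHostedIR; I'm also not certain the FileServer connector honors a parameterized host, since the doc only calls out a subset of connectors:

```json
{
    "name": "FileSystemLinkedService",
    "properties": {
        "type": "FileServer",
        "parameters": {
            "hostPath": { "type": "String" }
        },
        "typeProperties": {
            "host": "@{linkedService().hostPath}",
            "userId": "DOMAIN\\serviceaccount",
            "password": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "MyKeyVaultLinkedService",
                    "type": "LinkedServiceReference"
                },
                "secretName": "file-share-password"
            }
        },
        "connectVia": {
            "referenceName": "MySelfHostedIR",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```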
It was reasonably straightforward to do this by configuring separate linked services, datasets, and pipelines for each distinct file share, but I'd prefer to maintain a single pipeline.
I've already tried editing the linked service JSON to add the host parameter, following the guide below, but it didn't work. Can anyone help?
https://docs.microsoft.com/en-us/azure/data-factory/parameterize-linked-services
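For reference, the wiring I attempted looks roughly like the following; all names are placeholders, and the Blob sink dataset (BlobSinkDataset) is assumed to already exist. The source dataset declares its own sourceHost parameter and forwards it down to the linked service parameter:

```json
{
    "name": "FileSystemSourceDataset",
    "properties": {
        "type": "Binary",
        "linkedServiceName": {
            "referenceName": "FileSystemLinkedService",
            "type": "LinkedServiceReference",
            "parameters": {
                "hostPath": {
                    "value": "@dataset().sourceHost",
                    "type": "Expression"
                }
            }
        },
        "parameters": {
            "sourceHost": { "type": "String" }
        },
        "typeProperties": {
            "location": { "type": "FileServerLocation" }
        }
    }
}
```

The pipeline then iterates over the list of paths and passes each entry into the dataset:

```json
{
    "name": "CopyFileSharesToBlob",
    "properties": {
        "parameters": {
            "sourcePaths": {
                "type": "Array",
                "defaultValue": [
                    "C:\\fileshare1",
                    "D:\\fileshare2",
                    "\\\\localnetworkresource\\fileshare3"
                ]
            }
        },
        "activities": [
            {
                "name": "ForEachSourcePath",
                "type": "ForEach",
                "typeProperties": {
                    "items": {
                        "value": "@pipeline().parameters.sourcePaths",
                        "type": "Expression"
                    },
                    "activities": [
                        {
                            "name": "CopyShareToBlob",
                            "type": "Copy",
                            "inputs": [
                                {
                                    "referenceName": "FileSystemSourceDataset",
                                    "type": "DatasetReference",
                                    "parameters": { "sourceHost": "@item()" }
                                }
                            ],
                            "outputs": [
                                {
                                    "referenceName": "BlobSinkDataset",
                                    "type": "DatasetReference"
                                }
                            ],
                            "typeProperties": {
                                "source": {
                                    "type": "BinarySource",
                                    "storeSettings": {
                                        "type": "FileServerReadSettings",
                                        "recursive": true
                                    }
                                },
                                "sink": {
                                    "type": "BinarySink",
                                    "storeSettings": {
                                        "type": "AzureBlobStorageWriteSettings"
                                    }
                                }
                            }
                        }
                    ]
                }
            }
        ]
    }
}
```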