Azure Data Factory can execute custom activities as Azure Batch jobs. These jobs run an .exe (and its associated dependencies) stored in a storage account; the files are copied to the Batch node prior to execution.
There is a limitation on the files in the storage account that can be referenced:
Total size of resourceFiles cannot be more than 32768 characters
The workaround appears to be to zip the files in the storage account and unzip them as part of the command. This post suggests running the Batch Service command in Azure Data Factory as:
Unzip.exe [myZipFilename] && MyExeName.exe [cmdLineArgs]
Running this locally on a Windows 10 machine works fine. Setting it as the Command parameter on the Batch Service custom activity (using a Cloud Services pool with a Windows Server 2019 OS image) results in:
caution: filename not matched: &&
It feels like I'm missing something basic, but I've tried various permutations and cannot get it to work.
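The `caution: filename not matched: &&` message suggests that `&&` is being passed to Unzip.exe as a filename argument rather than being interpreted as a command separator: a Batch task command does not run under a shell by default. A plausible fix, assuming a Windows pool, is to invoke the Windows command interpreter explicitly so that it handles the `&&` operator (placeholders are the same ones used above):

```
cmd /c "Unzip.exe [myZipFilename] && MyExeName.exe [cmdLineArgs]"
```

With this wrapping, `cmd.exe` parses the `&&` and only runs the second command if the unzip succeeds, rather than Unzip.exe receiving `&&` and the exe name as extra arguments.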
Consider using the storageContainerUrl property rather than specifying each file individually. Assuming some of the files are in the same container, this should drastically reduce the number of characters in your request. – brklein
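To illustrate the comment above, a hypothetical resourceFiles entry might look like the following sketch (the account, container, and prefix names are placeholders, not from the question; the SAS token is elided). One storageContainerUrl entry with an optional blobPrefix replaces a long list of per-file entries, which is what shrinks the request size:

```json
"resourceFiles": [
  {
    "storageContainerUrl": "https://myaccount.blob.core.windows.net/mycontainer?<sas-token>",
    "blobPrefix": "app/"
  }
]
```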