We have a lot of jobs that have to run on our local computers, but we want to be able to run and manage those jobs from the cloud. These are not system-administration jobs; they are business-related tasks.
We have thought seriously about doing this in a couple of different ways:
- Logic Apps/Microsoft Flow can create a file in a folder on a local computer using the On-Premises Data Gateway. We could then use that file as the trigger for an on-premises script that runs constantly and watches the folder (a sketch of such a watcher follows this list). However, this feels clunky, since Logic Apps isn't triggering the script directly, only indirectly via a file-creation event. This approach would also require us to store a single username/password combination in Logic Apps and remember to keep that password up to date.
- Azure Event Grid can now forward events from Azure to a Hybrid Connection, which relays the event to a specific port on a local machine. Theoretically, we could have a PowerShell script monitoring that port and processing the incoming events. To me, this seems like the best way to trigger a script on an on-premises machine from the cloud; however, I'm not sure it will actually work the way I expect.
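For reference, here is a minimal sketch of the watcher script the first option would need, assuming Logic Apps drops trigger files into a folder the script can see. The folder path, file filter, and job script below are hypothetical placeholders:

```powershell
# Minimal sketch of an on-premises watcher for the Logic Apps approach.
# C:\Triggers and C:\Jobs\Run-DailyJob.ps1 are hypothetical placeholders.
$watcher = New-Object System.IO.FileSystemWatcher -Property @{
    Path                = 'C:\Triggers'
    Filter              = '*.trigger'
    EnableRaisingEvents = $true
}

Register-ObjectEvent -InputObject $watcher -EventName Created -Action {
    $file = $Event.SourceEventArgs.FullPath
    Write-Host "Trigger file detected: $file"

    # Run the business job, then remove the trigger file so it can fire again
    & 'C:\Jobs\Run-DailyJob.ps1'
    Remove-Item -Path $file -ErrorAction SilentlyContinue
} | Out-Null

# Keep the script alive so the event subscription stays registered
while ($true) { Start-Sleep -Seconds 5 }
```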
We have also looked at a few other ways we might be able to leverage Azure for this kind of thing:
- Azure Automation Hybrid Runbooks can trigger jobs on premises (starting a runbook on a hybrid worker group is a one-liner; see the sketch after this list). However, this service seems to be aimed mostly at administrative tasks, not daily business processes.
- Azure DevOps can trigger scripts on an on-premises computer using Self-Hosted Agents. However, I don't think Azure DevOps is designed to run a production process on a set schedule; it seems meant for software build and release pipelines.
- Azure Data Factory's Integration Runtime allows you to move data from an on-premises SQL Server to the cloud. This seems like an ideal platform for moving data from on-premises to the cloud, but I don't think Azure Data Factory can trigger an actual on-premises script from the cloud; I think it can only work with an on-premises SQL Server.
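For completeness, here is what kicking off a Hybrid Runbook from the cloud side looks like with the Az.Automation module. This is only a sketch; the resource group, Automation account, runbook name, and worker group below are hypothetical names:

```powershell
# Sketch: start a runbook on an on-premises Hybrid Runbook Worker group.
# Requires the Az.Accounts and Az.Automation modules; all names are hypothetical.
Connect-AzAccount

Start-AzAutomationRunbook `
    -ResourceGroupName     'MyResourceGroup' `
    -AutomationAccountName 'MyAutomationAccount' `
    -Name                  'Invoke-DailyJob' `
    -RunOn                 'OnPremWorkerGroup'   # name of the hybrid worker group
```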
So I'm trying to decide among these approaches, or find out whether there is a better way.
I think I'm going to try the Azure Event Grid approach: install the Hybrid Connection Manager on some local machines and keep a PowerShell script running 24/7 to monitor the specified port. Once an event from Azure Event Grid is routed to the Hybrid Connection Manager, the manager will forward it to the port PowerShell is listening on, and PowerShell can then trigger the job that needs to run on the local computer.
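Here is roughly what I have in mind for the listener side. This is just a sketch, assuming the Hybrid Connection Manager really does forward the event to a raw local TCP port the way I expect; the port number and job script are hypothetical placeholders, and the payload handling would depend on how Event Grid actually delivers the event:

```powershell
# Sketch of a local listener for events forwarded by the Hybrid Connection
# Manager. Port 8080 and C:\Jobs\Run-DailyJob.ps1 are hypothetical placeholders.
$listener = [System.Net.Sockets.TcpListener]::new([System.Net.IPAddress]::Loopback, 8080)
$listener.Start()
Write-Host 'Listening on port 8080 ...'

try {
    while ($true) {
        $client = $listener.AcceptTcpClient()        # blocks until a connection arrives
        try {
            $reader  = New-Object System.IO.StreamReader($client.GetStream())
            $payload = $reader.ReadToEnd()           # read the forwarded event payload
            Write-Host "Received event:`n$payload"

            # Kick off the business job in response to the event
            & 'C:\Jobs\Run-DailyJob.ps1'
        }
        finally {
            $client.Close()
        }
    }
}
finally {
    $listener.Stop()
}
```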
I really like that approach, since it would connect events on my local computers to Azure and third-party events through Azure Event Grid. To me, that opens up a world of possibilities for integration among disparate systems. But before I take this approach, I want to make sure it is the best one.