
We have a lot of jobs that have to run on our local computers, but we want to be able to run and manage those jobs from the cloud. These are not computer-administration jobs; they are business-related tasks.

We have thought seriously about doing this a couple different ways:

  1. Logic Apps/Microsoft Flow can create a file in a folder on a local computer using the On-Premises Data Gateway. We could then use this file as a trigger for an on-premises script that runs constantly and watches that folder (a minimal watcher sketch follows this list). However, this feels clunky, since Logic Apps isn't triggering the script directly but only via a simple file-creation event. This approach would also require us to use a single username/password combination in Logic Apps and remember to keep that password up to date.
  2. Azure Event Grid can now forward events from Azure to a Hybrid Connection, which transfers that event to a specific port on a local machine. Theoretically we could have a PowerShell script monitoring that port and processing the incoming event. To me, this seems like the best way to trigger a script on an on-premises machine from the cloud; however, I'm not sure whether it will actually work the way I expect.
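
For reference, here is a minimal sketch of the watcher script from option 1 above. The folder path and the job script it calls are placeholders, not part of our actual setup:

```powershell
# Watch the folder that Logic Apps writes trigger files into via the
# On-Premises Data Gateway. Folder path and job script are placeholders.
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path   = 'C:\Jobs\Triggers'
$watcher.Filter = '*.json'

Register-ObjectEvent -InputObject $watcher -EventName Created -Action {
    $file = $Event.SourceEventArgs.FullPath
    Write-Host "Trigger file detected: $file"
    & 'C:\Jobs\Run-DailyProcess.ps1' -TriggerFile $file   # kick off the business job
    Remove-Item $file                                      # clean up the trigger file
} | Out-Null

$watcher.EnableRaisingEvents = $true
while ($true) { Start-Sleep -Seconds 5 }   # keep the script (and the event subscription) alive
```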

We have also looked at a few other ways we might be able to leverage Azure for this kind of thing:

  1. Azure Automation Hybrid Runbooks can trigger jobs on premises. However, this service seems to be useful mostly for administrative tasks, not daily processes.
  2. Azure DevOps can trigger scripts to run on an on-premises computer using Self-Hosted Agents. However, I don't think Azure DevOps is designed to trigger a production process on a set schedule. It's only meant for software development build pipelines.
  3. Azure Data Factory Integration Runtime allows you to move data from an on-premises SQL Server to the cloud. This seems like an ideal platform for moving data from on-premises to the cloud, but I don't think Azure Data Factory can trigger an actual on-premises script from the cloud. I think it can only work with on-premises SQL Server.

So I'm trying to decide amongst these approaches, or see if there is a better way.

I think I'm going to try the Azure Event Grid approach: get the Hybrid Connection Manager installed on some local machines and keep some PowerShell scripts running 24/7 to monitor the specified port. Once an event from Azure Event Grid is routed to the Hybrid Connection Manager, it will forward it to the port PowerShell is listening on, and PowerShell can then trigger the job that needs to run on the local computer.
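
Here is a minimal sketch of what I have in mind for that listener, assuming the relayed event arrives as an HTTP POST on a local port (the port, URL prefix, and job script are placeholders, and I haven't confirmed this is exactly how the Hybrid Connection delivers the event):

```powershell
# Listen on a local port for relayed Event Grid events.
# Assumes the Hybrid Connection delivers each event as an HTTP POST to
# http://localhost:8080/ -- port, prefix, and job path are placeholders.
$listener = New-Object System.Net.HttpListener
$listener.Prefixes.Add('http://localhost:8080/')
$listener.Start()

while ($listener.IsListening) {
    $context = $listener.GetContext()          # blocks until a request arrives
    $reader  = New-Object System.IO.StreamReader($context.Request.InputStream)
    $body    = $reader.ReadToEnd()
    $events  = $body | ConvertFrom-Json        # Event Grid sends a JSON array of events

    foreach ($evt in $events) {
        Write-Host "Received event: $($evt.eventType)"
        & 'C:\Jobs\Run-DailyProcess.ps1' -EventData ($evt.data | ConvertTo-Json)
    }

    $context.Response.StatusCode = 200
    $context.Response.Close()
}
```

If the Hybrid Connection delivers a raw TCP stream instead of HTTP, the same idea would apply using System.Net.Sockets.TcpListener rather than HttpListener.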

I really like that approach since I can now connect my local events on my local computer to Azure and third-party events using Azure Event Grid. To me that opens up a world of possibilities for integration amongst disparate systems. But before I take this approach, I want to make sure it is the best one.

What about Azure Functions? Have you done any reading about them? – noitse
I don't think Azure Functions has on-premises capabilities, from what I've seen. That would be ideal. – user2363207
Azure Functions can actually run on-premises, but this comes with limitations (e.g. Windows 10 or Server 2016). See Azure Functions Runtime. – Hans Vonn
@BarrettNashville what solution did you end up going with? – Hans Vonn

1 Answer


It is unclear what type of data you are going to be sending from the cloud. You need to think about the following:

1) Do you want pull or push capability to trigger the tasks? I would recommend pull if you want to be able to carry out maintenance on your local computers. In addition, your local computers have limited capacity and can't scale with the load the way a push model requires.

2) Do you want to install additional server software? Azure Service Bus doesn't need any gateway to work on-premises.

Azure Service Bus can be consumed from a Windows Service or an IIS Always On application, and it still gives you the flexibility of integrating with third-party software, since Logic Apps, Flow, and Azure Functions can all work with Azure Service Bus.
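
As a rough illustration of the pull model, a local script could poll a Service Bus queue over the REST API. The namespace, queue, policy name, key, and job script below are placeholders:

```powershell
# Poll a Service Bus queue from a local machine (pull model sketch).
# Namespace, queue, policy, key, and job path are placeholders.
$Namespace  = 'contoso-ns'
$Queue      = 'local-jobs'
$PolicyName = 'listen-policy'
$Key        = '<shared access key>'

# Build a SAS token for the queue, valid for one hour
$ResourceUri  = "https://$Namespace.servicebus.windows.net/$Queue"
$Expiry       = [DateTimeOffset]::UtcNow.AddHours(1).ToUnixTimeSeconds()
$StringToSign = [Uri]::EscapeDataString($ResourceUri) + "`n" + $Expiry
$Hmac         = New-Object System.Security.Cryptography.HMACSHA256
$Hmac.Key     = [Text.Encoding]::UTF8.GetBytes($Key)
$Signature    = [Convert]::ToBase64String($Hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($StringToSign)))
$SasToken     = "SharedAccessSignature sr=$([Uri]::EscapeDataString($ResourceUri))&sig=$([Uri]::EscapeDataString($Signature))&se=$Expiry&skn=$PolicyName"

while ($true) {
    # Receive-and-delete read of the next message; 204 means the queue is empty
    $resp = Invoke-WebRequest -Method Delete `
        -Uri "$ResourceUri/messages/head?timeout=60" `
        -Headers @{ Authorization = $SasToken } `
        -UseBasicParsing
    if ($resp.StatusCode -eq 200 -and $resp.Content) {
        & 'C:\Jobs\Run-DailyProcess.ps1' -Message $resp.Content
    }
}
```

The same receive loop could also be written against the Service Bus .NET SDK inside a Windows Service; the REST call here just keeps the sketch self-contained.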