1 vote

I have a Functions app on the Consumption plan with a queue-triggered function that does some data processing and writes to Azure SQL. Each run takes approximately 1 minute, and I need to process around 1000 messages that arrive at roughly the same time. I worry about many of these runs hitting the database simultaneously; I am OK with the total time extending longer. This is how my host.json looks at the moment:

{
    "functionTimeout": "00:09:50",
    "queues": {
      "maxPollingInterval": 5000,
      "visibilityTimeout" : "00:05:00",      
      "maxDequeueCount": 5,
      "batchSize": 4,
      "newBatchThreshold": 2
    }
}

I am more concerned with reliability than with total time, so I thought one way to improve the solution would be to limit the maximum number of concurrent consumption units and take longer to get through the 1000 messages. Is there any way to set a maximum number of concurrent consumption units? Other advice on improving reliability for queue-triggered Consumption-plan functions would also be appreciated.

2 Answers

3 votes

There is no setting like Max Concurrent Consumption Units at the moment.

Your best knob to play with is batchSize, which defines how many messages are processed concurrently on a single instance (VM). Note that the host fetches a new batch once the number of in-flight messages drops to newBatchThreshold, so per-instance concurrency can actually reach batchSize + newBatchThreshold. And if the application scales out to multiple instances, each of them will run up to that many messages at once.

So the tuning is not exact, but by adjusting batchSize against your real workload you should usually be able to get to where you want to be.
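For instance, if you wanted each instance to process strictly one message at a time, the queues section could look like the sketch below (the values are illustrative, not a recommendation):

{
    "functionTimeout": "00:09:50",
    "queues": {
      "maxPollingInterval": 5000,
      "visibilityTimeout": "00:05:00",
      "maxDequeueCount": 5,
      "batchSize": 1,
      "newBatchThreshold": 0
    }
}

With batchSize 1 and newBatchThreshold 0, an instance fetches the next message only after the current one completes, so total concurrency is bounded by the number of instances the platform scales out to.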

Finally, you could use batchSize in combination with a fixed App Service plan, which gives you an exact guarantee on the number of instances, but it defeats the pay-per-execution benefit of the Consumption plan.

3 votes

Given that standard Azure Functions has no Max Concurrent Consumption setting, if you want complete control over concurrent processing you could consider Azure Durable Functions with an appropriate pattern.

Since you describe the work as a run, I expect you have (or could add) some kind of trigger to start the processing as an asynchronous workflow. If that is possible, you could rely on the Durable Functions Fan-Out/Fan-In pattern, which lets you control the level of parallelism you require. It does mean, however, that you would need to dequeue the 1000 messages yourself instead of relying on the trigger binding to do it for you.

High-level workflow example:

  1. Receive a trigger to start the run through a binding (an Event Grid notification or a message added to a queue, for example), or use the Monitor pattern to poll an external resource.

  2. Start a Durable Functions workflow that dequeues all your messages and processes them using the Fan-Out/Fan-In pattern, or use a simpler approach where you process the 1000 messages sequentially without the fan-out (see the sketch below).
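To make the second step concrete, here is a minimal sketch of such an orchestrator in Python with the azure-functions-durable library. The activity names DequeueMessages and ProcessMessage and the parallelism value are hypothetical placeholders, not part of the original answer:

import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    # Hypothetical activity that drains the queue and returns its messages.
    messages = yield context.call_activity("DequeueMessages", None)

    parallelism = 10  # illustrative cap on concurrent activity executions
    results = []
    # Fan out in fixed-size waves so that at most `parallelism` activities
    # run at the same time, then fan in before starting the next wave.
    for i in range(0, len(messages), parallelism):
        chunk = messages[i:i + parallelism]
        tasks = [context.call_activity("ProcessMessage", m) for m in chunk]
        results.extend((yield context.task_all(tasks)))

    return results

main = df.Orchestrator.create(orchestrator_function)

Capping concurrency inside the orchestration like this trades some throughput for control: each wave waits for its slowest activity before the next wave starts, which matches the original goal of letting the total time extend in exchange for fewer simultaneous database writers.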