
With an Azure Functions App on the Consumption plan, I have two timer triggered C# functions that take approximately 1 min to run.

Each of them runs successfully when they run at different times. However, if they overlap in time, the functions never finish and reach the timeout.

I cannot see where the concurrency issue could be coming from. The functions read data from Azure SQL via EF and write results to different blobs on Azure Storage.
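
For concreteness, each function looks roughly like this; the schedule, entity, DbContext, and blob names below are placeholders rather than the actual code:

    using System;
    using System.Data.Entity;
    using System.Linq;
    using System.Threading.Tasks;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    // Hypothetical EF6 model standing in for the real one.
    public class Order
    {
        public int Id { get; set; }
        public decimal Total { get; set; }
    }

    public class ReportDbContext : DbContext
    {
        public ReportDbContext() : base("name=ReportDb") { } // assumed connection string name
        public DbSet<Order> Orders { get; set; }
    }

    public static class GenerateReportA
    {
        [FunctionName("GenerateReportA")]
        public static async Task Run(
            [TimerTrigger("0 0 * * * *")] TimerInfo timer, // assumed hourly schedule
            ILogger log)
        {
            log.LogInformation("GenerateReportA started");

            string csv;
            using (var db = new ReportDbContext()) // fresh context per invocation, nothing static
            {
                var rows = await db.Orders.AsNoTracking().ToListAsync();
                csv = string.Join(Environment.NewLine, rows.Select(o => $"{o.Id},{o.Total}"));
            }

            // Each function writes to its own blob, so the two never touch the same one.
            var account = CloudStorageAccount.Parse(
                Environment.GetEnvironmentVariable("AzureWebJobsStorage"));
            var container = account.CreateCloudBlobClient().GetContainerReference("reports");
            await container.CreateIfNotExistsAsync();
            await container.GetBlockBlobReference("report-a.csv").UploadTextAsync(csv);

            log.LogInformation("GenerateReportA finished");
        }
    }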

The two functions were previously implemented as WebJobs in an ASP.NET MVC web app (the Web App) hosted in Azure App Service, and they ran correctly at the same time there.

When I moved the two WebJobs to the Azure Functions App, I literally copied the contents of the Web App bin folder into the Function App bin folder and then referenced the necessary DLLs in each of the functions.

One possible cause I can think of for this concurrency issue is the copying of the Web App bin folder contents.

The other possible cause is that the Consumption plan is not budgeting the memory of each newly triggered function and therefore not scaling correctly. For example, if a newly triggered function requires significant memory on its own, perhaps it should be assigned to a new compute instance instead of sharing an existing one. Because my functions each use a significant amount of memory, if the first function is assigned to a small compute instance and the second function is assigned to that same small instance, together they could exceed its limit and cause the timeout.

EDIT

Additional results following suggestions from Matt Mason MSFT.

  • Using a second function app, also on the Consumption plan and with the same files in the bin folder, I was able to run both functions at the same time on different function apps, and they completed successfully. This makes me believe the issue comes from the function app and not from the Azure SQL or Blob Storage dependencies.

However, running the functions in separate apps is not an acceptable solution.

  • Results from running both functions on the same function app while monitoring the Live Metrics Stream:

    a. Each function run individually completes successfully and shows CPU Total in the range of 70% to 97% and memory in the range of 400 to 700 MB. See the first image.

    b. Both functions running at the same time: I see CPU at 110% and memory at 800 MB, then the metrics stream goes blank and shows "No Servers Online". After a while I see one server online again, but the state of both functions is "Never finished". Is this a crash of the function app? See images 2, 3 and 4 below (one function; two functions at the beginning; two functions after some time).

Are you using a shared (static) instance of your DB connection for EF? Functions running on the same underlying host share an AppDomain, which might be causing some weirdness. – Jesse Carter
Each function creates its own instance of EF's DbContext. As an additional result, I can run both functions at the same time on separate function apps (see EDIT above). – donquijote
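
To illustrate the distinction the comment raises, here is a small sketch using the hypothetical ReportDbContext from the example above:

    // The risky pattern would be one DbContext shared by every invocation on the
    // same instance/AppDomain; DbContext is not thread-safe, so overlapping
    // executions would interfere with each other. That is NOT what these functions do.
    public static class DataAccess
    {
        // Risky (shown only for contrast, left commented out):
        // private static readonly ReportDbContext Shared = new ReportDbContext();

        // What the functions actually do: a fresh context per invocation,
        // disposed when the invocation ends.
        public static ReportDbContext CreateContext() => new ReportDbContext();
    }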

1 Answer


Try using App Insights to profile your function:

https://github.com/Azure/Azure-Functions/wiki/App-Insights-(Preview)

You should be able to see per-instance performance counters (how much memory is in use, etc.), aggregated execution metrics, and execution logs.

If the functions absolutely cannot run together, try putting them in separate function apps so that they won't run on the same instance.


Update: It could be a crash of the function app. It could also be that CPU is so high that it impacts the App Insights data upload - check whether the server ID has changed.

You can use an ILogger parameter binding in your functions and log progress updates to see execution traces in App Insights Analytics.
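
For example, something like this (the function name, schedule, and stubbed steps are placeholders for your own code); filtering the traces table on these messages in Analytics will show how far each execution got:

    using System.Threading.Tasks;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;

    public static class GenerateReportB
    {
        [FunctionName("GenerateReportB")]
        public static async Task Run(
            [TimerTrigger("0 30 * * * *")] TimerInfo timer, // assumed schedule
            ILogger log)
        {
            log.LogInformation("GenerateReportB: started (past due: {pastDue})", timer.IsPastDue);

            log.LogInformation("GenerateReportB: querying Azure SQL via EF");
            await QueryDatabaseAsync();   // placeholder for the existing EF code

            log.LogInformation("GenerateReportB: writing results to blob storage");
            await WriteBlobAsync();       // placeholder for the existing blob code

            log.LogInformation("GenerateReportB: finished");
        }

        private static Task QueryDatabaseAsync() => Task.CompletedTask; // stub
        private static Task WriteBlobAsync() => Task.CompletedTask;     // stub
    }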

Unfortunately, scaling does not work well in these scenarios. I suggest you file an issue at https://github.com/azure/azure-webjobs-sdk-script/issues to ask for better distribution of heavyweight timer functions across instances. However, your best bet is to use separate function apps, or to add lease management/scheduling to ensure your functions do not run simultaneously.
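
For the lease option, here is a rough sketch that uses a blob lease as a simple lock; the "locks" container, the blob naming, and the skip-on-contention behavior are assumptions you would adapt:

    using System;
    using System.Threading;
    using System.Threading.Tasks;
    using Microsoft.Extensions.Logging;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    public static class LeaseGuard
    {
        // Runs 'body' only if a lease on the lock blob can be acquired; otherwise skips.
        public static async Task RunExclusiveAsync(string lockName, Func<Task> body, ILogger log)
        {
            var account = CloudStorageAccount.Parse(
                Environment.GetEnvironmentVariable("AzureWebJobsStorage"));
            var container = account.CreateCloudBlobClient().GetContainerReference("locks");
            await container.CreateIfNotExistsAsync();

            var blob = container.GetBlockBlobReference(lockName);
            if (!await blob.ExistsAsync())
            {
                await blob.UploadTextAsync(string.Empty); // the blob exists only to be leased
            }

            string leaseId;
            try
            {
                // 60 seconds is the longest fixed lease; renew it while the body runs.
                leaseId = await blob.AcquireLeaseAsync(TimeSpan.FromSeconds(60), null);
            }
            catch (StorageException)
            {
                log.LogWarning("{lockName} is held by another execution; skipping.", lockName);
                return;
            }

            var lease = AccessCondition.GenerateLeaseCondition(leaseId);
            using (new Timer(_ => blob.RenewLeaseAsync(lease), null,
                             TimeSpan.FromSeconds(30), TimeSpan.FromSeconds(30)))
            {
                try
                {
                    await body();
                }
                finally
                {
                    await blob.ReleaseLeaseAsync(lease);
                }
            }
        }
    }

Each timer function would then wrap its existing body in RunExclusiveAsync with the same lock name, so whichever function fires while the other is still running simply skips that occurrence (or you could retry after a delay instead).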