
I have two Azure Function apps. The first is written in C# and retrieves various files and directories per client, either from API requests or through SFTP, and moves the files to Azure Blob Storage. The second app is written in Python and relies on timer triggers to pick up those files every day, process them, and push the cleaned data to an Azure SQL Server.

I would like to process these files in real time instead of relying on timer triggers. For example, if client one has 10 directories with one file in each directory, once all of those files have been picked up by the C# Azure Function, I would like the Python Azure Function to automatically pick them up and do the data processing. What would be the best way to achieve this? Should the C# application post to an HTTP endpoint, and then the Python application use an HTTP trigger to determine when all the files are ready to be processed? I'm having a hard time finding documentation that shows examples of this scenario.


1 Answer


You simply need to use a BlobTrigger binding for your second function instead of a TimerTrigger.

You can trigger a function to run whenever a file/blob is added to a specific container in blob storage. Since your first function saves files to blob storage, just change the trigger binding on your second function from a timer trigger to a blob trigger. This ensures the second function runs and processes each file as soon as it lands in blob storage. You can use path patterns to specify which blobs should fire the trigger, for example samples-workitems/{name}, where samples-workitems is your container name and {name} matches any blob in that container.
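If it helps, here is a minimal sketch of what the Python side could look like using the Python v2 programming model. The function name, the samples-workitems container, and the AzureWebJobsStorage connection setting are just placeholders taken from the example above; adjust them to match your own app:

```python
import logging

import azure.functions as func

app = func.FunctionApp()

# Fires once for each new blob added to the samples-workitems container.
# "connection" names the app setting that holds the storage connection string.
@app.blob_trigger(arg_name="client_file",
                  path="samples-workitems/{name}",
                  connection="AzureWebJobsStorage")
def process_client_file(client_file: func.InputStream):
    logging.info("Processing blob %s (%s bytes)", client_file.name, client_file.length)
    raw_bytes = client_file.read()
    # ... clean the data here and push the result to Azure SQL ...
```

Note that the trigger fires once per blob, not once per batch, so each file is processed individually as it arrives.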

For a working example of using blob triggers, see: https://docs.microsoft.com/en-us/learn/modules/execute-azure-function-with-triggers/8-create-blob-trigger