
I have an Azure Function that uses an Event Hub trigger to post data to Blob storage. I read the incoming payload to determine the folder structure of the blob, and I also insert some values from the payload into a MS SQL database. However, these values have to be inserted only once an hour, not on every trigger. How can I achieve this?

I'm reading the incoming message like this:

var msg = JsonConvert.DeserializeObject<DeviceInfo>(Convert.ToString(myEventHubMessage));

and storing in the blob:

using (var writer = binder.Bind<TextWriter>(new BlobAttribute(path)))
{
     writer.Write(myEventHubMessage);
}

Here I check whether the record has already been inserted into the database, and insert it if not. But the method CurrentTimeUnprocessed() makes a call to the database on every request, which I want to avoid.

if (CurrentTimeUnprocessed(parameter_array) == 0)
     AddToUnprocessed(parameter_array);

What's the best way to achieve this?

Can you separate the tasks: have the triggered function pile the data up as messages in a queue, and have a second function, triggered hourly, that drains the queue and inserts the data into SQL? (See the sketch after these comments.) - CSharpRocks
@CSharpRocks The event hub receives a lot of messages from a variety of devices: some every second, some every 5 seconds, and some every few minutes. I think it will clog the queue before the other function triggers. - MAK
The data that you store in SQL, is it an aggregation of the data that the function processed in the hour or is it raw? - CSharpRocks
What about using a TimerTrigger function that runs every hour? - Thomas
@CSharpRocks it is a raw version of the data. - MAK
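
A rough sketch of the queue-plus-timer idea from the comments, in case it helps. Everything named here is an assumption (a Storage queue called pending-device-info, an event hub called myhub, precompiled C# functions with the usual WebJobs attribute bindings); the existing blob write and AddToUnprocessed logic would stay as they are.

using System.Linq;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.WindowsAzure.Storage.Queue;
using Newtonsoft.Json;

public static class HourlySqlWriter
{
    // Event hub trigger: keep writing the blob as before, but instead of calling
    // the database on every message, just enqueue the values destined for SQL.
    [FunctionName("EnqueueDeviceInfo")]
    public static void Enqueue(
        [EventHubTrigger("myhub", Connection = "EventHubConnection")] string myEventHubMessage,
        [Queue("pending-device-info")] out string queueItem)
    {
        queueItem = myEventHubMessage;   // or a trimmed-down projection of the payload
    }

    // Timer trigger: fires at the top of every hour and drains the queue into SQL.
    [FunctionName("FlushToSql")]
    public static async Task Flush(
        [TimerTrigger("0 0 * * * *")] TimerInfo timer,
        [Queue("pending-device-info")] CloudQueue queue)
    {
        // Storage queues return at most 32 messages per call, so drain in batches.
        var batch = (await queue.GetMessagesAsync(32)).ToList();
        while (batch.Count > 0)
        {
            foreach (CloudQueueMessage message in batch)
            {
                var info = JsonConvert.DeserializeObject<DeviceInfo>(message.AsString);
                AddToUnprocessed(info);                    // the poster's existing insert logic
                await queue.DeleteMessageAsync(message);
            }
            batch = (await queue.GetMessagesAsync(32)).ToList();
        }
    }
}

Because the hourly function keeps draining until the queue is empty, the backlog is bounded by one hour's worth of messages rather than growing without limit.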

1 Answer


Azure Functions don't maintain any state between invocations, so you would have to store the timestamp somewhere.

Is there any reason why you don't want to check the database on every request? It could be a very lightweight query that simply checks a timestamp on a table, so it should be fast.
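
As a rough illustration of how light that check can be, the lookup and the insert can even be collapsed into a single round trip. The table and column names below (DeviceReadings, DeviceId, InsertedHour, Payload) are made up for the example and would need to match whatever AddToUnprocessed actually writes:

using System;
using System.Data.SqlClient;

public static class HourlyGate
{
    // Inserts a row only if nothing has been written for this device in the
    // current (UTC) hour. Returns true when a row was actually inserted.
    public static bool TryInsertForCurrentHour(string connectionString,
                                               string deviceId, string payload)
    {
        DateTime now = DateTime.UtcNow;
        DateTime hour = new DateTime(now.Year, now.Month, now.Day, now.Hour, 0, 0, DateTimeKind.Utc);

        const string sql = @"
            INSERT INTO DeviceReadings (DeviceId, InsertedHour, Payload)
            SELECT @DeviceId, @Hour, @Payload
            WHERE NOT EXISTS (SELECT 1 FROM DeviceReadings
                              WHERE DeviceId = @DeviceId AND InsertedHour = @Hour);";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@DeviceId", deviceId);
            cmd.Parameters.AddWithValue("@Hour", hour);
            cmd.Parameters.AddWithValue("@Payload", payload);
            conn.Open();
            return cmd.ExecuteNonQuery() == 1;   // 0 means this hour was already handled
        }
    }
}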

If this is not an option, another alternative would be to use something like Redis with an expiring key.
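
With StackExchange.Redis, for example, that could be a single SET-if-not-exists with a one-hour expiry; the key name and connection setting below are just examples:

using System;
using StackExchange.Redis;

public static class RedisHourlyGate
{
    private static readonly Lazy<ConnectionMultiplexer> Redis =
        new Lazy<ConnectionMultiplexer>(() =>
            ConnectionMultiplexer.Connect(Environment.GetEnvironmentVariable("RedisConnection")));

    // Returns true only for the first caller in any given hour for this device;
    // everyone else sees the existing key and skips the SQL insert.
    public static bool IsFirstTriggerThisHour(string deviceId)
    {
        IDatabase db = Redis.Value.GetDatabase();
        return db.StringSet($"sql-written:{deviceId}",
                            DateTime.UtcNow.ToString("o"),
                            TimeSpan.FromHours(1),
                            When.NotExists);
    }
}

One thing to note with this approach: the key expires an hour after the first write rather than on the clock hour, so the inserts follow the arrival time of the first message instead of landing on fixed hourly boundaries.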