0 votes

I am working on a local dev environment with Storage Explorer (connected to local emulated storage) and my webjob is triggered on new queue messages. For testing, I publish 100 queue messages and my webjob function prints a counter value to the console log:

        Interlocked.Increment(ref counter);
        log.WriteLine($"counter: {counter}");

(counter being a static int)

It takes 30 seconds to go through 100 messages. Is this rate expected? Is there any way to make it faster, considering that the function's operation is rather simple and doesn't write to a DB/table?
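
For reference, the function is roughly shaped like this (a simplified sketch of my setup; the class, method, and queue names below are placeholders, not my exact code):

    using System.IO;
    using System.Threading;
    using Microsoft.Azure.WebJobs;

    public class Functions
    {
        // shared across concurrent invocations, hence Interlocked
        private static int counter;

        // runs once per message on the queue ("test-queue" is a placeholder name)
        public static void ProcessQueueMessage(
            [QueueTrigger("test-queue")] string message,
            TextWriter log)
        {
            Interlocked.Increment(ref counter);
            log.WriteLine($"counter: {counter}");
        }
    }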

I am posting this in relation to my original question, which currently has no solution: Slow azure queue webjob performance (local dev)

Are you running against the local Storage Emulator? Or are you running locally against an actual Azure Storage Queue? - Rob Reagan
running locally against storage emulator - Alex E

2 Answers

2 votes

The local Storage Emulator is in no way indicative of the performance you'll see against a real Azure Storage Queue. Behind the scenes, the emulator mimics the storage services with a local SQL Server instance it creates, which is slow by comparison and has limited support for concurrency.

To get a true test, provision a Storage Queue in Azure. For best performance, avoid network latency by running the processes that enqueue and dequeue in the same Azure datacenter.
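
If it helps, switching from the emulator to a real account is only a connection string change. Here is a minimal sketch (assuming a WebJobs SDK 2.x JobHostConfiguration; the account name and key are placeholders) of setting it in code:

    using Microsoft.Azure.WebJobs;

    class Program
    {
        static void Main()
        {
            var config = new JobHostConfiguration();

            // Replace "UseDevelopmentStorage=true" with a real storage account
            // (placeholder account name and key below).
            config.StorageConnectionString =
                "DefaultEndpointsProtocol=https;AccountName=<your-account>;AccountKey=<your-key>";
            config.DashboardConnectionString = config.StorageConnectionString;

            new JobHost(config).RunAndBlock();
        }
    }

The same effect is normally achieved by updating the AzureWebJobsStorage and AzureWebJobsDashboard connection strings in App.config.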

0 votes

It depends on the size of your messages, of course, but I suspect the infrastructure and hardware play a role as well. What App Service plan is your WebJob running on, and how big are your messages?

According to the docs (https://docs.microsoft.com/en-us/azure/storage/storage-performance-checklist#queues):

A single queue can process approximately 2,000 messages (1KB each) per second (each AddMessage, GetMessage, and DeleteMessage count as a message here).

Without more details, such as the full code of your process, it is hard to tell, but in theory you should be able to get far more throughput: at roughly 2,000 messages per second, 100 small messages should be dequeued from a real queue in a fraction of a second, so 30 seconds suggests the bottleneck is in your local setup rather than the queue itself.
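
If you are using the WebJobs SDK, one knob worth checking (an assumption on my part, since the host setup isn't shown in the question) is the queue batch size, concurrency threshold, and polling interval, which default to fairly conservative values. A sketch:

    using System;
    using Microsoft.Azure.WebJobs;

    class Program
    {
        static void Main()
        {
            var config = new JobHostConfiguration();

            // Pull up to 32 messages per fetch (32 is the SDK maximum).
            config.Queues.BatchSize = 32;
            // Start fetching the next batch while this many messages are still being processed.
            config.Queues.NewBatchThreshold = 64;
            // Cap the polling back-off so new messages are picked up quickly after idle periods.
            config.Queues.MaxPollingInterval = TimeSpan.FromSeconds(2);

            new JobHost(config).RunAndBlock();
        }
    }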