
I'm using an Azure environment and developing in .NET.

I am running a web app (ClientApp) that takes client data to perform a series of calculations. The calculations are performance-intensive, so they run on a separate web app (CalcApp).

Currently, the ClientApp sends the calculation request to the CalcApp. The requests from every client are put into a common queue and run one at a time, FIFO. My goal is to create separate queues for each client and run several calculations concurrently.

I am thinking of using Azure Service Bus queues to accomplish this. The ClientApp would check for an existing queue for that client and create one if needed. The CalcApp would periodically check for existing queues; when it finds a new one, it would create a new QueueClient that uses OnMessageAsync() with RunCalculationsAsync() as the callback.
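Something like this is what I have in mind. This is only a rough sketch using the older WindowsAzure.ServiceBus SDK (the one that exposes OnMessageAsync()); the connection string, the queue-naming scheme, and the RunCalculationsAsync body are placeholders:

```csharp
using System.Threading.Tasks;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

class PerClientQueues
{
    const string ConnectionString = "<service-bus-connection-string>"; // placeholder

    // ClientApp: make sure the client's queue exists, then send the request.
    public static async Task SendCalculationRequestAsync(string clientId, string payload)
    {
        var ns = NamespaceManager.CreateFromConnectionString(ConnectionString);
        var queueName = "calc-" + clientId;                       // assumed naming scheme
        if (!await ns.QueueExistsAsync(queueName))
            await ns.CreateQueueAsync(queueName);

        var queue = QueueClient.CreateFromConnectionString(ConnectionString, queueName);
        await queue.SendAsync(new BrokeredMessage(payload));
    }

    // CalcApp: attach a message pump to a queue discovered via NamespaceManager.GetQueuesAsync().
    public static void StartListening(string queueName)
    {
        var queue = QueueClient.CreateFromConnectionString(ConnectionString, queueName);
        queue.OnMessageAsync(async message =>
        {
            await RunCalculationsAsync(message.GetBody<string>());
            await message.CompleteAsync();                        // complete only after the calc succeeds
        },
        new OnMessageOptions { AutoComplete = false });
    }

    static Task RunCalculationsAsync(string payload) { return Task.FromResult(0); } // stub
}
```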

Is this feasible or even a good idea?


1 Answer


I would consider using multiple consumers instead, perhaps with a topic denoting the "client" if you need to differentiate the type of processing based on which client originated it. Each client adds an entry to the same queue, and the consumers "fight" over the messages. Because Service Bus locks each message as it is delivered (peek-lock), there is no chance of the same message being processed by two consumers at once if you follow this approach.

I'm not sure having multiple queues is necessary.

Here is more information on the Competing Consumers pattern: https://msdn.microsoft.com/en-us/library/dn568101.aspx
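To make that concrete, here is a minimal sketch of the sending side under that model. The queue name "calc-requests" and the "ClientId" property are purely illustrative, not anything your code has to use:

```csharp
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

class SharedQueueSender
{
    // ClientApp: every client sends into the same queue, tagged with its id
    // so CalcApp can vary its processing (or filter via topic subscriptions) if needed.
    public static Task EnqueueAsync(string connectionString, string clientId, string payload)
    {
        var queue = QueueClient.CreateFromConnectionString(connectionString, "calc-requests");
        var message = new BrokeredMessage(payload);
        message.Properties["ClientId"] = clientId;
        return queue.SendAsync(message);
    }
}
```

The consuming side is essentially the OnMessageAsync registration you already have in mind; you just deploy it to as many CalcApp instances as you need, and the peek-lock keeps each message with a single consumer.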

You could also build one consumer that processes several messages at a time. In this model you still have one queue and one consumer, but you can run more than one calculation concurrently. Ultimately, though, competing consumers scales further, and you can combine the two strategies: multiple consumer instances, each handling several messages concurrently.
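As a sketch (same caveats as above), the OnMessageAsync pump can do this fan-out for you via MaxConcurrentCalls, so you don't need to spawn threads yourself:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

class ConcurrentConsumer
{
    // One queue, one consumer process, several calculations in flight at once.
    public static void Start(string connectionString, string queueName)
    {
        var queue = QueueClient.CreateFromConnectionString(connectionString, queueName);
        var options = new OnMessageOptions
        {
            AutoComplete = false,
            MaxConcurrentCalls = Environment.ProcessorCount // illustrative; tune for your workload
        };
        queue.OnMessageAsync(async message =>
        {
            await RunCalculationsAsync(message.GetBody<string>());
            await message.CompleteAsync();
        }, options);
    }

    static Task RunCalculationsAsync(string payload) { return Task.FromResult(0); } // stand-in for the real work
}
```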