43 votes

I'm using a Microsoft Azure Service Bus queue to process calculations. My program runs fine for a few hours, but then I start to get this exception for every message I process from then on. I have no clue where to start, since everything runs fine for the first few hours and my code seems correct as well. I'll post the method where I handle the Azure Service Bus message.

public static async Task processCalculations(BrokeredMessage message)
    {
        try
        {
            if (message != null)
            {
                if (connection == null || !connection.IsConnected)
                {
                    connection = await ConnectionMultiplexer.ConnectAsync("connection,SyncTimeout=10000,ConnectTimeout=10000");
                    //connection = ConnectionMultiplexer.Connect("connection,SyncTimeout=10000,ConnectTimeout=10000");
                }

                cache = connection.GetDatabase();

                string sandpKey = message.Properties["sandp"].ToString();
                string dateKey = message.Properties["date"].ToString();
                string symbolclassKey = message.Properties["symbolclass"].ToString();
                string stockdataKey = message.Properties["stockdata"].ToString();
                string stockcomparedataKey = message.Properties["stockcomparedata"].ToString();

                var sandpTask = cache.GetAsync<List<StockData>>(sandpKey);
                var dateTask = cache.GetAsync<DateTime>(dateKey);
                var symbolinfoTask = cache.GetAsync<SymbolInfo>(symbolclassKey);
                var stockdataTask = cache.GetAsync<List<StockData>>(stockdataKey);
                var stockcomparedataTask = cache.GetAsync<List<StockMarketCompare>>(stockcomparedataKey);

                await Task.WhenAll(sandpTask, dateTask, symbolinfoTask,
                    stockdataTask, stockcomparedataTask);

                List<StockData> sandp = sandpTask.Result;
                DateTime date = dateTask.Result;
                SymbolInfo symbolinfo = symbolinfoTask.Result;
                List<StockData> stockdata = stockdataTask.Result;
                List<StockMarketCompare> stockcomparedata = stockcomparedataTask.Result;

                StockRating rating = performCalculations(symbolinfo, date, sandp, stockdata, stockcomparedata);

                if (rating != null)
                {
                    saveToTable(rating);
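                    // Note: the check below tests the minute component of LockedUntilUtc, not the time remaining on the lock (see the UPDATE below).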
                    if (message.LockedUntilUtc.Minute <= 1)
                    {
                        await message.RenewLockAsync();
                    }
                    await message.CompleteAsync(); // getting exception here
                }
                else
                {
                    Console.WriteLine("Message " + message.MessageId + " Completed!");
                    await message.CompleteAsync();
                }
            }
        }
        catch (TimeoutException time)
        {
            Console.WriteLine(time.Message);
        }
        catch (MessageLockLostException locks)
        {
            Console.WriteLine(locks.Message);
        }
        catch (RedisConnectionException redis)
        {
            Console.WriteLine("Start the redis server service!");
        }
        catch (MessagingCommunicationException communication)
        {
            Console.WriteLine(communication.Message);
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.Message);
            Console.WriteLine(ex.StackTrace);
        }
    }

UPDATE: I check the time left until the lock expires and call RenewLock if it's needed. The renewal succeeds with no errors, but I'm still getting this exception.

timeLeft = message.LockedUntilUtc - DateTime.UtcNow;
if (timeLeft.TotalMinutes <= 2)
{
    //Console.WriteLine("Renewed lock! " + ((TimeSpan)(message.LockedUntilUtc - DateTime.UtcNow)).TotalMinutes);
    message.RenewLock();
}

catch (MessageLockLostException locks)
{
    Console.WriteLine("Delivery Count: " + message.DeliveryCount);
    Console.WriteLine("Enqueued Time: " + message.EnqueuedTimeUtc);
    Console.WriteLine("Expires Time: " + message.ExpiresAtUtc);
    Console.WriteLine("Locked Until Time: " + message.LockedUntilUtc);
    Console.WriteLine("Scheduled Enqueue Time: " + message.ScheduledEnqueueTimeUtc);
    Console.WriteLine("Current Time: " + DateTime.UtcNow);
    Console.WriteLine("Time Left: " + timeLeft);
}

All I know so far is that my code runs fine for a while, and the lock renewal gets called and succeeds, but I'm still getting the lock exception. Inside that exception handler I output timeLeft, and the time difference keeps increasing as the code runs, which makes me believe that the lock expiration time is somehow not being updated.
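
A small diagnostic along these lines (using the same BrokeredMessage API as above; whether the client-side property is refreshed after a renewal may vary by SDK version) would show whether LockedUntilUtc actually moves after a renewal:

DateTime before = message.LockedUntilUtc;
message.RenewLock();
DateTime after = message.LockedUntilUtc;

// Log the lock expiry before and after the renewal to see whether the client-side value moves.
Console.WriteLine("LockedUntilUtc before renew: " + before);
Console.WriteLine("LockedUntilUtc after renew:  " + after);
Console.WriteLine("Extended by: " + (after - before));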

5 Comments
Did you ever resolve this? What was it? – Fyodor Soikin
@FyodorSoikin No, I eventually had to give up and work on a different method to do the same thing. From what I could tell, it is a bug in the API, but no one from Microsoft ever responded to my posts. – DarthVegan
By any chance could any of your individual messages take longer than 60 seconds to process? – DalSoft
@DalSoft I would renew the lock after the intensive calculation was finished and right before it processed the message. – DarthVegan
@user3610374 Have you tried setting the LockDuration to 5 minutes (which is the maximum)? The default is 60 seconds, and this exception will be thrown if 60 seconds pass before the lock is renewed. – DalSoft

5 Answers

33 votes

I spent hours trying to understand why I was getting a MessageLockLostException. For me, the reason was that AutoComplete defaults to true.

If you're going to call message.Complete() (or CompleteAsync()), then you should instantiate an OnMessageOptions object, set AutoComplete to false, and pass it into your OnMessage call.

var options = new OnMessageOptions();
options.AutoComplete = false;

client.OnMessage(processCalculations, options);
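
A note on why this matters, assuming the legacy OnMessage pump: with AutoComplete left at its default of true, the pump also tries to settle the message after your callback runs, so an explicit Complete()/CompleteAsync() inside the handler and the pump's own completion end up competing for the same lock, which can surface as a MessageLockLostException.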

13 votes

I was having a similar issue. Messages were being handled successfully, but by the time they went to complete, the Service Bus no longer had a valid lock. It turned out my TopicClient.PrefetchCount was too high.

It appears that the lock starts on all prefetched messages as soon as they are fetched. If your cumulative message processing time exceeds the lock timeout, every prefetched message beyond that point will fail to complete and will be returned to the Service Bus.
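
To rule prefetch out, something like this disables it while you diagnose (a sketch using the legacy Microsoft.ServiceBus.Messaging QueueClient from the question; the connection string and queue name are placeholders):

var client = QueueClient.CreateFromConnectionString(connectionString, "calcqueue");

// Prefetched messages are locked as soon as they are fetched, so keep this small enough
// that every prefetched message can still be completed before its lock expires.
// 0 disables prefetching entirely.
client.PrefetchCount = 0;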

1 vote

Instead of renewing the lock manually, try having it auto-renewed when you set up the subscription client, using OnMessageOptions like this:

OnMessageOptions options = new OnMessageOptions();
options.AutoRenewTimeout = TimeSpan.FromMinutes(1);

try
{
    client = Subscription.CreateClient();
    client.OnMessageAsync(MessageReceivedComplete, options);
}
catch (Exception ex)
{
    throw new Exception(ex.Message, ex);
}

0 votes

In my case, it was just that I was working on a V2 on my local machine while the V1 was already deployed and running.

As the V1 was deployed in Azure (closer to the queue) and compiled in release mode (versus my local version in debug mode), the deployed version consistently won the race for messages on the queue.

That's why the message was no longer in the queue: it had already been consumed by the deployed version of my code. I know, it's a little bit dumb.

0 votes

It took me 2 days to resolve a similar issue with the same exception.
This exception can have several causes; I'll describe a couple of config options that may help you, stranger...

Service Bus queue or topic subscription config:

  • Message lock duration on the queue / topic subscription is too low; set it to roughly your message processing time

ServiceBusClient options config:

  • tryTimeout is too short; set it to ~10 s for diagnostics

ServiceBusProcessor options config:

  • AutoCompleteMessages defaults to true; set it to false
  • PrefetchCount is too high; for diagnostics set it to 0
  • ReceiveMode: set it to ServiceBusReceiveMode.PeekLock
  • MaxConcurrentCalls: for diagnostics set it to 1

After finding the correct values (tuned for the given system) I no longer observed any issues; a minimal sketch of these settings follows.
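
For reference, a rough sketch of those diagnostic settings with the .NET Azure.Messaging.ServiceBus SDK (the connection string and queue name are placeholders, and the values are the diagnostic ones listed above, not production recommendations):

using System;
using Azure.Messaging.ServiceBus;

var clientOptions = new ServiceBusClientOptions
{
    RetryOptions = new ServiceBusRetryOptions
    {
        TryTimeout = TimeSpan.FromSeconds(10)      // ~10 s per try while diagnosing
    }
};

await using var client = new ServiceBusClient(connectionString, clientOptions);

// Note: the message lock duration itself is configured on the queue/subscription entity, not here.
var processor = client.CreateProcessor("calc-queue", new ServiceBusProcessorOptions
{
    AutoCompleteMessages = false,                  // complete explicitly in the message handler
    PrefetchCount = 0,                             // no prefetch while diagnosing
    ReceiveMode = ServiceBusReceiveMode.PeekLock,
    MaxConcurrentCalls = 1
});

// Register ProcessMessageAsync / ProcessErrorAsync handlers and call
// processor.StartProcessingAsync() to actually start receiving.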