5 votes

I have a WebJob running continuously using version 1.0.0 of the WebJobs SDK.
Here is the configuration code:

public static void Main()
{
    var config = new JobHostConfiguration();
    // Move a message to the poison queue after a single failed processing attempt.
    config.Queues.MaxDequeueCount = 1;
    // Retrieve and process one queue message at a time.
    config.Queues.BatchSize = 1;

    var host = new JobHost(config);
    host.RunAndBlock();
}

Here is my job function signature:

public static void HuntVkusniyBlogRss([QueueTrigger("queue-rss")] string message,
    DateTimeOffset expirationTime,
    DateTimeOffset insertionTime,
    DateTimeOffset nextVisibleTime,
    string id,
    string popReceipt,
    int dequeueCount,
    string queueTrigger,
    CloudStorageAccount cloudStorageAccount,
    TextWriter log)
{
    ...
}

I expect that after the function runs successfully, the message is deleted from the triggering queue. But the function is being run more than once.

Logging shows that it runs at every nextVisibleTime, and dequeueCount increments each time.

UPDATE: It looks like the error happens after the function is executed. Here are the logs from the WebJobs dashboard:

[12/08/2014 13:10:47 > 51d5f4: INFO] Executing: 'Program.HuntVkusniyBlogRss' because New queue message detected on 'queue-rss'.
[12/08/2014 13:10:53 > 51d5f4: ERR ] 
[12/08/2014 13:10:53 > 51d5f4: ERR ] Unhandled Exception: Microsoft.WindowsAzure.Storage.StorageException: The remote server returned an error: (404) Not Found. ---> System.Net.WebException: The remote server returned an error: (404) Not Found.
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at Microsoft.WindowsAzure.Storage.Shared.Protocol.HttpResponseParsers.ProcessExpectedStatusCodeNoException[T](HttpStatusCode expectedStatusCode, HttpStatusCode actualStatusCode, T retVal, StorageCommandBase`1 cmd, Exception ex)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at Microsoft.WindowsAzure.Storage.Blob.CloudBlobSharedImpl.<DeleteBlobImpl>b__1b(RESTCommand`1 cmd, HttpWebResponse resp, Exception ex, OperationContext ctx)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.EndGetResponse[T](IAsyncResult getResponseResult)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    --- End of inner exception stack trace ---
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.EndExecuteAsync[T](IAsyncResult result)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob.EndDelete(IAsyncResult asyncResult)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at Microsoft.WindowsAzure.Storage.Core.Util.AsyncExtensions.<>c__DisplayClass4.<CreateCallbackVoid>b__3(IAsyncResult ar)
[12/08/2014 13:10:53 > 51d5f4: ERR ] --- End of stack trace from previous location where exception was thrown ---
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at Microsoft.Azure.WebJobs.Host.Protocols.PersistentQueueWriter`1.<DeleteAsync>d__6.MoveNext()
[12/08/2014 13:10:53 > 51d5f4: ERR ] --- End of stack trace from previous location where exception was thrown ---
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at Microsoft.Azure.WebJobs.Host.Loggers.CompositeFunctionInstanceLogger.<DeleteLogFunctionStartedAsync>d__e.MoveNext()
[12/08/2014 13:10:53 > 51d5f4: ERR ] --- End of stack trace from previous location where exception was thrown ---
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.<TryExecuteAsync>d__1.MoveNext()
[12/08/2014 13:10:53 > 51d5f4: ERR ] --- End of stack trace from previous location where exception was thrown ---
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueTriggerExecutor.<ExecuteAsync>d__0.MoveNext()
[12/08/2014 13:10:53 > 51d5f4: ERR ] --- End of stack trace from previous location where exception was thrown ---
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener.<ProcessMessageAsync>d__11.MoveNext()
[12/08/2014 13:10:53 > 51d5f4: ERR ] --- End of stack trace from previous location where exception was thrown ---
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at Microsoft.Azure.WebJobs.Host.Timers.BackgroundExceptionDispatcher.<>c__DisplayClass1.<Throw>b__0()
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at System.Threading.ThreadHelper.ThreadStart_Context(Object state)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
[12/08/2014 13:10:53 > 51d5f4: ERR ]    at System.Threading.ThreadHelper.ThreadStart()
[12/08/2014 12:08:18 > 51d5f4: SYS ERR ] Job failed due to exit code -532462766
[12/08/2014 12:08:18 > 51d5f4: SYS INFO] Process went down, waiting for 0 seconds
[12/08/2014 12:08:18 > 51d5f4: SYS INFO] Status changed to PendingRestart
[12/08/2014 12:08:23 > 51d5f4: SYS INFO] Run script 'HunterWebJob.exe' with script host - 'WindowsScriptHost'
[12/08/2014 12:08:23 > 51d5f4: SYS INFO] Status changed to Running
[12/08/2014 12:08:25 > 51d5f4: INFO] Found the following functions:
[12/08/2014 12:08:25 > 51d5f4: INFO] Cookagg.HunterWebJob.Program.HuntVkusniyBlogRss
[12/08/2014 12:08:25 > 51d5f4: INFO] Job host started
Is your function throwing an exception? Can you show us the execution logs from the WebJobs dashboard? – Victor Hurdugaci
The function is not throwing an exception. I added the logs from the dashboard. – Alexey Veselovskiy
How long does your job take to execute? Is it more than 10 minutes? How many consumers do you have for this queue? – b2zw2a
The job runs continuously. The function takes less than 10 minutes to execute. The WebJob is the only consumer of the queue. – Alexey Veselovskiy
I'm having the exact same problem. Existing code started to reprocess already successfully completed jobs again. It only started in the last month, I believe. – TWith2Sugars

3 Answers

1 vote

I ran into this problem because the containers are not always auto-created, so I created the following containers manually (a code sketch is shown below):

azure-jobs-host-output
azure-webjobs-hosts
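
For reference, here is a rough sketch of creating those containers with the storage client library. It assumes the same storage connection string the JobHost uses, and EnsureWebJobsContainersExist is just a hypothetical helper name:

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

private static void EnsureWebJobsContainersExist(string connectionString)
{
    // Parse the storage account the WebJobs host writes its logs to.
    CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
    CloudBlobClient blobClient = account.CreateCloudBlobClient();

    // Create the containers the host and dashboard expect, if they are missing.
    blobClient.GetContainerReference("azure-jobs-host-output").CreateIfNotExists();
    blobClient.GetContainerReference("azure-webjobs-hosts").CreateIfNotExists();
}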

0 votes

I had the exact same problem: I had separate storage accounts for the data and the logs. When I used the same storage account for both, the bug stopped.
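
A minimal sketch of that, assuming the Main() from the question: point both the host's runtime storage and the dashboard logging at the same account (sharedConnectionString is a placeholder for that single account's connection string). Equivalently, set the AzureWebJobsStorage and AzureWebJobsDashboard connection strings to the same value.

var config = new JobHostConfiguration
{
    // Host runtime storage and dashboard logging both use the same storage account.
    StorageConnectionString = sharedConnectionString,
    DashboardConnectionString = sharedConnectionString
};
config.Queues.MaxDequeueCount = 1;
config.Queues.BatchSize = 1;

var host = new JobHost(config);
host.RunAndBlock();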

0 votes

Check the WebJobs SDK version; if it is lower than 1.1.1.0, upgrade to the latest version. We had the same issue with WebJobs SDK version 1.0.1.0, and upgrading to the latest one fixed it. The referenced assemblies should look like this:

  1. Microsoft.Azure.WebJobs.dll (in our case the current version was 1.0.1.0 and we needed 1.1.1.0)
  2. Microsoft.WindowsAzure.Storage.dll (the current version was 4.2.1.0; we needed ≥ 4.3.0 and ≤ 5.0.2)
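
For example, in a packages.config-based project (the net45 target framework is assumed here), the relevant NuGet entries would end up looking roughly like this after the upgrade:

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- WebJobs SDK upgraded from 1.0.1 to 1.1.1 -->
  <package id="Microsoft.Azure.WebJobs" version="1.1.1" targetFramework="net45" />
  <package id="Microsoft.Azure.WebJobs.Core" version="1.1.1" targetFramework="net45" />
  <!-- Storage client within the supported range (>= 4.3.0 and <= 5.0.2) -->
  <package id="WindowsAzure.Storage" version="4.3.0" targetFramework="net45" />
</packages>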