5
votes

Just for fun, I wrote the code below to simulate a deadlock. Then I sat and patiently watched it run until the thread pool's count of available worker threads went down to zero. I was curious to see what would happen. Would it throw an exception?

using System;
using System.Diagnostics;
using System.Threading;

namespace Deadlock
{
    class Program
    {
        private static readonly object lockA = new object();
        private static readonly object lockB = new object();

        static void Main(string[] args)
        {
            int worker, io;

            ThreadPool.GetAvailableThreads(out worker, out io);

            Console.WriteLine($"Total number of thread pool threads: {worker}, {io}");
            Console.WriteLine($"Total threads in my process: {Process.GetCurrentProcess().Threads.Count}");
            Console.ReadKey();

            try
            {
                for (int i = 0; i < 1000000; i++)
                {
                    AutoResetEvent auto1 = new AutoResetEvent(false);
                    AutoResetEvent auto2 = new AutoResetEvent(false);

                    ThreadPool.QueueUserWorkItem(ThreadProc1, auto1);
                    ThreadPool.QueueUserWorkItem(ThreadProc2, auto2);

                    var allCompleted = WaitHandle.WaitAll(new[] { auto1, auto2 }, 20);

                    ThreadPool.GetAvailableThreads(out worker, out io);
                    var total = Process.GetCurrentProcess().Threads.Count;

                    if (allCompleted)
                    {    
                        Console.WriteLine($"All threads done: (Iteration #{i + 1}). Total: {total}, Available: {worker}, {io}\n");
                    }
                    else
                    {
                        Console.WriteLine($"Timed out: (Iteration #{i + 1}). Total: {total}, Available: {worker}, {io}\n");
                    }
                }

                Console.WriteLine("Press any key to exit...");
            }
            catch(Exception ex)
            {
                Console.WriteLine("An exception occurred.");
                Console.WriteLine($"{ex.GetType().Name}: {ex.Message}");
                Console.WriteLine("The program will now exit. Press any key to terminate the program...");
            }

            Console.ReadKey();
        }

        // ThreadProc1 and ThreadProc2 acquire lockA and lockB in opposite
        // order, so two concurrently running work items can deadlock.
        static void ThreadProc1(object state)
        {
            lock(lockA)
            {
                Console.WriteLine("ThreadProc1 entered lockA. Going to acquire lockB");

                lock(lockB)
                {
                    Console.WriteLine("ThreadProc1 acquired both locks: lockA and lockB.");

                    //Do stuff
                    Console.WriteLine("ThreadProc1 running...");
                }
            }

            if (state != null)
            {
                ((AutoResetEvent)state).Set();
            }
        }

        static void ThreadProc2(object state)
        {
            lock(lockB)
            {
                Console.WriteLine("ThreadProc2 entered lockB. Going to acquire lockA.");

                lock(lockA)
                {
                    Console.WriteLine("ThreadProc2 acquired both locks: lockA and lockB.");

                    // Do stuff
                    Console.WriteLine("ThreadProc2 running...");
                }
            }

            if (state != null)
            {
                ((AutoResetEvent)state).Set();
            }
        }
    }
}

Meanwhile, I also kept the Windows Task Manager's Performance tab running and watched the total number of operating system threads go up as my program ate up more threads.

Here is what I observed:

  1. The OS did not create a new thread each time the .NET thread pool ran a work item. In fact, for every four or five iterations of my for loop, the OS thread count went up by only one or two. This was interesting, but it isn't my question; it just confirms what has already been established.

  2. More interestingly, I observed that the number of available worker threads did not decrease by 2 on every iteration of my for loop. I expected it to drop by 2 each time, because none of my deadlocked threads should ever return; they are deadlocked, waiting on each other.

  3. I also observed that when the number of available worker threads in the thread pool reached zero, the program still kept running more iterations of my for loop. This made me curious: where were those new threads coming from if the thread pool had already run out of threads and none of them had returned?

So, to clarify, here are my two questions, which are perhaps related in that a single answer may explain both:

  1. For some iterations of my for loop, no new thread pool threads were created, yet the iteration still ran. Why? And where did the thread pool get the threads to run those iterations on?

  2. Where did the thread pool get threads from once it had exhausted its available worker threads, yet kept running my for loop?

3
Maybe the Monitor.Enter logic allows the locked thread to be reused. Why not? The thread is waiting anyway, so let's run another job on it. – Sinatr
That's a nice but weird theory that did enter my mind once. :-) – Water Cooler v2

3 Answers

8
votes
    ThreadPool.GetAvailableThreads(out worker, out io);

That's not a great statistic to show you how the thread pool works. The primary problem is that it is a ridiculously large number on all recent .NET versions. On my dual-core laptop it starts out at 1020 in 32-bit mode and 32767 in 64-bit mode. Far, far larger than such an anemic CPU could reasonably handle. This number has inflated significantly over the years; it started out at 50x the number of cores back in .NET 2.0. It is now calculated dynamically based on machine capabilities, a job of the CLR host. It uses a glass that's well over half-full.
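
For context, the pool's configured minimum and maximum give a better picture than the "available" count. Here is a minimal sketch that prints all three (the exact numbers vary by machine, bitness and .NET version):

    using System;
    using System.Threading;

    class PoolLimits
    {
        static void Main()
        {
            int minWorker, minIo, maxWorker, maxIo, availWorker, availIo;

            // Threads up to the minimum are created on demand; it defaults to the core count.
            ThreadPool.GetMinThreads(out minWorker, out minIo);
            // The inflated upper bound described above.
            ThreadPool.GetMaxThreads(out maxWorker, out maxIo);
            // "Available" is simply the maximum minus the threads currently in use.
            ThreadPool.GetAvailableThreads(out availWorker, out availIo);

            Console.WriteLine($"Min:       {minWorker} worker, {minIo} I/O");
            Console.WriteLine($"Max:       {maxWorker} worker, {maxIo} I/O");
            Console.WriteLine($"Available: {availWorker} worker, {availIo} I/O");
        }
    }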

The primary job of the thread pool manager is to keep threading efficient. The sweet spot is to keep the number of executing threads limited to the number of processor cores. Running more reduces performance: the OS then has to context-switch between threads, and that adds overhead.

That ideal cannot always be met, however; in practice, the tp work items that programmers write are not always well-behaved. They take too long and/or spend too much time blocking on I/O or a lock instead of executing code. Your example is, of course, a rather extreme case of blocking.

The thread pool manager is not otherwise aware of exactly why a tp thread takes too long to execute. All it can see is that it takes too long to complete. Getting deep insight into exactly why is not practical; it takes a debugger and the kind of heavily trained, massively parallel neural network that programmers have between their ears.

Twice a second, the thread pool manager re-evaluates the workload and allows an extra tp thread to start when none of the active ones complete, even though that is beyond the optimum. The theory is that this is likely to get more work done, since presumably the active threads are blocking too much and not using the available cores efficiently. It is also important for resolving some deadlock scenarios, albeit ones you never want to rely on. The extra thread is just a regular thread like any other; the underlying OS call is CreateThread().

So that's what you see: the number of available threads drops by one twice a second. This is time-based and independent of your code. There is actually a feedback loop implemented in the manager that tries to dynamically calculate the optimum number of extra threads, but you never got there, since all of your threads block.

This does not go on forever; you ultimately reach the high upper limit set by the default SetMaxThreads(). There is no exception. Assuming you don't hit an OutOfMemoryException first, as you commonly would in real life, it just stops adding more threads. You are still adding execution requests to the thread pool (that covers your bullet 3); they just never actually get started. Eventually you will run out of memory when the number of queued requests gets too large, but you'll have to wait a long time; it takes a while to fill up a gigabyte.
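
A minimal sketch of that endgame, assuming you lower the cap yourself so you don't have to wait for the default one: cap the worker count at the core count, queue more blocking work items than that, and the surplus items simply sit in the queue and never start (the class and counter names are just for illustration):

    using System;
    using System.Threading;

    class MaxThreadsDemo
    {
        static int started = 0;

        static void Main()
        {
            // Cap the pool at the core count. SetMaxThreads returns false if the
            // value is below the current minimum, which also defaults to the core count.
            ThreadPool.SetMaxThreads(Environment.ProcessorCount, Environment.ProcessorCount);

            int queued = Environment.ProcessorCount * 4;
            for (int i = 0; i < queued; i++)
            {
                ThreadPool.QueueUserWorkItem(_ =>
                {
                    Interlocked.Increment(ref started);
                    Thread.Sleep(Timeout.Infinite);   // block forever, like the deadlocked threads
                });
            }

            Thread.Sleep(5000);
            Console.WriteLine($"Queued: {queued}, started: {started}");
        }
    }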

4
votes

The cause is QueueUserWorkItem: "Queues a method for execution. The method executes when a thread pool thread becomes available." https://msdn.microsoft.com/en-us/library/kbf0f1ct(v=vs.110).aspx

In my understanding, the thread pool just slowly increases the number of threads to meet your demand; this is what you see in Task Manager. I think you could verify this by giving your threads some real work to do.

Edit: What I mean is, you just queue the work items; the first threads start right away, and then slowly (one every 500 ms, https://blogs.msdn.microsoft.com/pedram/2007/08/05/dedicated-thread-or-a-threadpool-thread/) more and more threads are added until the limits are reached. After that you can still queue new items.
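
One way to see that the slow ramp-up is the pool's own throttling rather than an OS limit: raise the pool's minimum worker count, and threads up to that minimum are created on demand instead of one every ~500 ms. A minimal sketch (64 is an arbitrary number chosen for illustration):

    using System;
    using System.Diagnostics;
    using System.Threading;

    class MinThreadsDemo
    {
        static void Main()
        {
            // Below the configured minimum, the pool creates a thread on demand;
            // above it, it falls back to the slow injection rate.
            ThreadPool.SetMinThreads(64, 64);

            for (int i = 0; i < 64; i++)
            {
                ThreadPool.QueueUserWorkItem(_ => Thread.Sleep(Timeout.Infinite));
            }

            Thread.Sleep(2000);
            Console.WriteLine($"Process threads after 2s: {Process.GetCurrentProcess().Threads.Count}");
        }
    }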

1
votes

The thread pool (almost) never runs out of threads. There's an injection heuristic that adds new threads in an (almost) unbounded way when it thinks doing so will help throughput. It also acts as a guard against deadlocks caused by too few threads being available.

This can be a big problem because memory usage is (almost) unbounded.

"Almost" because there is a maximum thread count but that tends to be extremely high in practice (thousands of threads).

When a single iteration of my for-loop ran, for some of those iterations, no thread pool threads were created.

The reason is not apparent to me from the data shown. You probably should measure Process.GetCurrentProcess().Threads.Count after each iteration.

Maybe the deadlock was avoided in some cases? It's not a deterministic deadlock.
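
If you want the deadlock to be deterministic for the experiment, you could hold the first lock for a moment before asking for the second, so both work items are guaranteed to be inside their outer lock before either attempts the inner one. A sketch of the idea for ThreadProc1 (ThreadProc2 would get the mirror-image change); the 100 ms sleep is purely for illustration:

    static void ThreadProc1(object state)
    {
        lock (lockA)
        {
            // Give ThreadProc2 time to take lockB before we ask for it,
            // so the two work items reliably end up waiting on each other.
            Thread.Sleep(100);

            lock (lockB)
            {
                // Never reached once ThreadProc2 holds lockB.
            }
        }

        if (state != null)
        {
            ((AutoResetEvent)state).Set();
        }
    }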

On the current CLR, managed threads appear to map 1:1 to OS threads.

Maybe you should run a simpler benchmark?

        // Queue work items that block forever and watch the pool slowly grow.
        for (int i = 0; i < 10000000; i++)
        {
            Task.Run(() => Thread.Sleep(Timeout.Infinite));

            int workerThreads;
            int cpThreads;
            ThreadPool.GetAvailableThreads(out workerThreads, out cpThreads);

            Console.WriteLine($"Queued: {i}, threads: {Process.GetCurrentProcess().Threads.Count}, workerThreads: {workerThreads}, completionPortThreads: {cpThreads}");

            Thread.Sleep(100);
        }
