Note that I'm asking about something that will call a callback function more often than once every 15 ms using something like System.Threading.Timer. I'm not asking about how to accurately time a piece of code using something like System.Diagnostics.Stopwatch or even QueryPerformanceCounter.
Also, I've read related questions such as "Accurate Windows timer? System.Timers.Timer() is limited to 15 msec", none of which supplies a useful answer to my question.
In addition, the recommended MSDN article, Implement a Continuously Updating, High-Resolution Time Provider for Windows, is about timing things rather than providing a continuous stream of ticks.
With that said...
There's a whole lot of bad information out there about the .NET timer objects. For example, System.Timers.Timer is billed as "a high performance timer optimized for server applications." And System.Threading.Timer is somehow considered a second class citizen. The conventional wisdom is that System.Threading.Timer is a wrapper around the Windows Timer Queue Timers and that System.Timers.Timer is something else entirely.
The reality is much different. System.Timers.Timer is just a thin component wrapper around System.Threading.Timer (just use Reflector or ILDASM to peek inside System.Timers.Timer and you'll see the reference to System.Threading.Timer), and has some code that will provide automatic thread synchronization so you don't have to do it.
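The synchronization it provides is the SynchronizingObject plumbing: set it to a control and the Elapsed event gets marshalled onto that control's thread for you, which System.Threading.Timer won't do. A throwaway illustration (assuming a Windows Forms project; the class name here is mine):

using System;
using System.Windows.Forms;

// Illustration only: System.Timers.Timer exposes a SynchronizingObject property;
// when it's set to a control, the Elapsed event is raised on that control's
// thread automatically. System.Threading.Timer has no equivalent -- you'd call
// Invoke/BeginInvoke yourself.
class TimerDemoForm : Form
{
    readonly System.Timers.Timer timer = new System.Timers.Timer(100);

    public TimerDemoForm()
    {
        timer.SynchronizingObject = this;                                    // Elapsed now fires on the UI thread
        timer.Elapsed += (s, e) => Text = e.SignalTime.ToString("HH:mm:ss.fff");
        Load += (s, e) => timer.Start();                                     // start once the window handle exists
    }

    [STAThread]
    static void Main()
    {
        Application.Run(new TimerDemoForm());
    }
}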
System.Threading.Timer, as it turns out, is not a wrapper for the Timer Queue Timers. At least not in the 2.0 runtime, which was used from .NET 2.0 through .NET 3.5. A few minutes with the Shared Source CLI shows that the runtime implements its own timer queue that is similar to the Timer Queue Timers, but never actually calls the Win32 functions.
It appears that the .NET 4.0 runtime also implements its own timer queue. My test program (see below) provides similar results under .NET 4.0 as it does under .NET 3.5. I've created my own managed wrapper for the Timer Queue Timers and proved that I can get 1 ms resolution (with quite good accuracy), so I consider it unlikely that I'm reading the CLI source wrong.
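For reference, my wrapper is little more than P/Invoke over CreateTimerQueueTimer and DeleteTimerQueueTimer. A stripped-down sketch of the approach (the class and member names are mine, and error handling is omitted):

using System;
using System.Runtime.InteropServices;

// Minimal sketch of a managed wrapper over the Win32 Timer Queue Timer APIs.
class TimerQueueTimer : IDisposable
{
    delegate void WaitOrTimerCallback(IntPtr parameter, bool timerOrWaitFired);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool CreateTimerQueueTimer(
        out IntPtr newTimer,
        IntPtr timerQueue,              // IntPtr.Zero = default timer queue
        WaitOrTimerCallback callback,
        IntPtr parameter,
        uint dueTime,                   // ms before the first callback
        uint period,                    // ms between callbacks
        uint flags);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool DeleteTimerQueueTimer(
        IntPtr timerQueue, IntPtr timer, IntPtr completionEvent);

    readonly WaitOrTimerCallback callback;   // keep a reference so the GC doesn't collect the delegate
    IntPtr handle;

    public TimerQueueTimer(Action tick, uint dueTime, uint period)
    {
        callback = (p, fired) => tick();
        if (!CreateTimerQueueTimer(out handle, IntPtr.Zero, callback, IntPtr.Zero, dueTime, period, 0))
            throw new InvalidOperationException("CreateTimerQueueTimer failed");
    }

    public void Dispose()
    {
        if (handle != IntPtr.Zero)
        {
            // Passing -1 (INVALID_HANDLE_VALUE) blocks until in-flight callbacks finish.
            DeleteTimerQueueTimer(IntPtr.Zero, handle, new IntPtr(-1));
            handle = IntPtr.Zero;
        }
    }
}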
I have two questions:
First, what causes the runtime's implementation of the timer queue to be so slow? I can't get better than 15 ms resolution, and accuracy seems to be in the range of -1 to +30 ms. That is, if I ask for 24 ms, I'll get ticks anywhere from 23 to 54 ms apart. I suppose I could spend some more time with the CLI source to track down the answer, but thought somebody here might know.
Second, and I realize that this is harder to answer, why not use the Timer Queue Timers? I realize that .NET 1.x had to run on Win9x, which didn't have those APIs, but they've existed since Windows 2000, which, if I remember correctly, was the minimum requirement for .NET 2.0. Is it because the CLI had to run on non-Windows boxes?
My timers test program:
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Threading;

namespace TimerTest
{
    class Program
    {
        const int TickFrequency = 5;
        const int TestDuration = 15000;   // 15 seconds

        static void Main(string[] args)
        {
            // Create a list to hold the tick times.
            // The list is pre-allocated to prevent list resizing
            // from slowing down the test.
            List<double> tickTimes = new List<double>(2 * TestDuration / TickFrequency);

            // Start a stopwatch so we can keep track of how long this takes.
            Stopwatch Elapsed = Stopwatch.StartNew();

            // Create a timer that saves the elapsed time at each tick.
            Timer ticker = new Timer((s) =>
            {
                tickTimes.Add(Elapsed.ElapsedMilliseconds);
            }, null, 0, TickFrequency);

            // Wait for the test to complete.
            Thread.Sleep(TestDuration);

            // Destroy the timer and stop the stopwatch.
            ticker.Dispose();
            Elapsed.Stop();

            // Now let's analyze the results.
            Console.WriteLine("{0:N0} ticks in {1:N0} milliseconds", tickTimes.Count, Elapsed.ElapsedMilliseconds);
            Console.WriteLine("Average tick frequency = {0:N2} ms", (double)Elapsed.ElapsedMilliseconds / tickTimes.Count);

            // Compute min and max deviation from the requested frequency.
            double minDiff = double.MaxValue;
            double maxDiff = double.MinValue;
            for (int i = 1; i < tickTimes.Count; ++i)
            {
                double diff = (tickTimes[i] - tickTimes[i - 1]) - TickFrequency;
                minDiff = Math.Min(diff, minDiff);
                maxDiff = Math.Max(diff, maxDiff);
            }

            Console.WriteLine("min diff = {0:N4} ms", minDiff);
            Console.WriteLine("max diff = {0:N4} ms", maxDiff);

            Console.WriteLine("Test complete. Press Enter.");
            Console.ReadLine();
        }
    }
}
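(If anyone wants to rule the system clock interrupt interval in or out as a factor, this is how I'd bracket the test with timeBeginPeriod/timeEndPeriod from winmm.dll. I haven't verified that it changes the numbers above; it's just the obvious experiment to run.)

using System;
using System.Runtime.InteropServices;

// Sketch only: request a finer system timer resolution around a test run.
// timeBeginPeriod/timeEndPeriod are the standard winmm.dll calls; whether
// this actually tightens the managed timer's tick spacing is the thing to measure.
static class SystemTimerResolution
{
    [DllImport("winmm.dll")]
    static extern uint timeBeginPeriod(uint periodMs);

    [DllImport("winmm.dll")]
    static extern uint timeEndPeriod(uint periodMs);

    // Usage: wrap the body of Main above, e.g. RunWithResolution(1, () => { ...test... });
    public static void RunWithResolution(uint periodMs, Action test)
    {
        timeBeginPeriod(periodMs);      // ask for a finer clock interrupt interval
        try
        {
            test();
        }
        finally
        {
            timeEndPeriod(periodMs);    // always restore the previous resolution
        }
    }
}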