A followup to Does .NET JIT optimize empty loops away?:
The following program just runs an empty loop a billion times and prints the elapsed time. It takes about 700 ms on my machine, and I'm curious whether there's a way to get the jitter to optimize the empty loop away.
using System;

namespace ConsoleApplication1 {
    class Program {
        static void Main() {
            var start = DateTime.Now;
            for (var i = 0; i < 1000000000; i++) { }
            Console.WriteLine((DateTime.Now - start).TotalMilliseconds);
        }
    }
}
As far as I can tell the answer is no, but I don't know whether there are compiler options I haven't tried. I've made sure to compile in Release mode and run with no debugger attached, yet the empty loop still takes 700 ms. I also tried NGEN, with the same result (though my understanding is that it should produce essentially the same compiled code as the JIT anyway, right?). However, I've never used NGEN before, so I may be using it wrong.
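As an aside on the measurement itself (my own sketch, not part of the original question): `DateTime.Now` has fairly coarse resolution on Windows (on the order of 10-15 ms), so `System.Diagnostics.Stopwatch` is the usual tool for timing like this. The class and member names below are from the standard library; the surrounding program structure is mine:

```csharp
using System;
using System.Diagnostics;

class LoopTimer {
    static void Main() {
        // Stopwatch uses the high-resolution performance counter
        // where available, unlike DateTime.Now.
        var sw = Stopwatch.StartNew();
        for (var i = 0; i < 1000000000; i++) { }
        sw.Stop();
        Console.WriteLine(sw.Elapsed.TotalMilliseconds);
    }
}
```

This shouldn't change the conclusion (700 ms is far above the timer's noise floor), but it makes small differences between runs meaningful.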
It seems like this would be an easy optimization for the JIT to find and apply, but knowing very little about how jitters work in general, I'm curious whether there's a specific reason this optimization was left out. Also, the VC++ compiler definitely does seem to perform this optimization, so I wonder why the discrepancy. Any ideas?