7
votes

Let's compare two pieces of code:

String str = null;
//Possibly do something...
str = "Test";
Console.WriteLine(str);

and

String str;
//Possibly do something...
str = "Test";
Console.WriteLine(str);

I always thought these two pieces of code were equivalent. But after building this code (Release mode with optimizations enabled) and comparing the generated IL, I noticed that the first sample contains two more IL instructions:

1st sample code IL:

.maxstack 1
.locals init ([0] string str)
IL_0000: ldnull
IL_0001: stloc.0
IL_0002: ldstr "Test"
IL_0007: stloc.0
IL_0008: ldloc.0
IL_0009: call void [mscorlib]System.Console::WriteLine(string)
IL_000e: ret

2nd sample code IL:

.maxstack 1
.locals init ([0] string str)
IL_0000: ldstr "Test"
IL_0005: stloc.0
IL_0006: ldloc.0
IL_0007: call void [mscorlib]System.Console::WriteLine(string)
IL_000c: ret

Is this perhaps optimized away by the JIT compiler? Does initializing a local method variable with null impact performance (I understand it is a very simple operation, but still), and should we avoid it? Thanks in advance.

3
It's generally considered bad form to initialize with a value that will never be used, simply because it adds confusion (there is a 'null' being assigned that has no meaning to the logic). Any associated performance hit is negligible; the former reason is a much more compelling reason to avoid it. – Dan Bryant
@Dan Bryant: Thanks for the comment. I agree that we should not use null for initialization, but some developers prefer it to make the initialization explicit. I have one such developer on my team :) – petro.sidlovskyy
Generally I prefer not to even declare the variable until it's being initialized, as this further limits the conceptual scope when trying to understand the code (i.e. locals are as local as possible to where they are used). A side effect of this approach is that, as a method grows, the placement of locals tends to highlight areas of the code where new methods can be extracted. – Dan Bryant
@Jason: That's not a very good example - 'line' could definitely be scoped inside the loop body. – nobody
@Andrew Medico: You're too kind, it's a terrible example and that was clearly retarded of me. Here's a better example: string value; if(!dictionary.TryGetValue(key, out value)) { // something }. – jason
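
A minimal sketch of the late-declaration pattern from the comments above (the dictionary and key names are purely illustrative):

using System.Collections.Generic;

class Example
{
    static string Describe(Dictionary<string, string> dictionary, string key)
    {
        // Declare the local at the point where it is first needed,
        // instead of pre-initializing it to null at the top of the method.
        string value;
        if (!dictionary.TryGetValue(key, out value))
        {
            return "missing";
        }
        return value;
    }
}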

3 Answers

8
votes

http://www.codinghorror.com/blog/2005/07/for-best-results-dont-initialize-variables.html

To summarize from the article, after running various benchmarks, initializing an object to a value (either as part of a definition, in the class' constructor, or as part of an initialization method) can be anywhere from roughly 10-35% slower on .NET 1.1 and 2.0. Newer compilers may optimize away initialization on definition. The article closes by recommending to avoid initialization as a general rule.
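
For anyone who wants to measure this themselves, here is a rough sketch of the kind of micro-benchmark the article describes (the classes and iteration count are made up for illustration; on newer runtimes the compiler may optimize the redundant initialization away, so expect the gap to shrink or vanish):

using System;
using System.Diagnostics;

class WithInit { public string Name = null; public int Count = 0; }   // explicit (redundant) initialization
class WithoutInit { public string Name; public int Count; }           // fields left at their defaults

class Benchmark
{
    const int Iterations = 10000000;

    static void Main()
    {
        // Warm up both paths so JIT compilation time is not measured.
        new WithInit(); new WithoutInit();

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < Iterations; i++) { new WithInit(); }
        sw.Stop();
        Console.WriteLine("Initialized fields:   {0} ms", sw.ElapsedMilliseconds);

        sw.Restart();
        for (int i = 0; i < Iterations; i++) { new WithoutInit(); }
        sw.Stop();
        Console.WriteLine("Uninitialized fields: {0} ms", sw.ElapsedMilliseconds);
    }
}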

6
votes

It is slightly slower, as Jon.Stromer.Galley's link points out. But the difference is amazingly small; likely on the order of nanoseconds. At that level, the overhead from using a high-level language like C# dwarfs any performance difference. If performance is that much of an issue, you may as well be coding in C or ASM or something.

The value of writing clear code (whatever that means to you) will far outweigh the 0.00001ms performance increase in terms of cost vs. benefit. That's why C# and other high-level languages exist in the first place.

I get that this is probably meant as an academic question, and I don't discount the value of understanding the internals of the CLR. But in this case, it just seems like the wrong thing to focus on.

2
votes

Today (2019) both the .NET Framework and the .NET Core compilers are smart enough to optimize unneeded initializations away. (Along with the useless stloc.0 - ldloc.0 pair.)

Both versions compile as

        .maxstack 8

        ldstr "Test"
        call void [System.Console]System.Console::WriteLine(string)
        ret

See my SharpLab experiment as reference.

Of course implementations change, but Justin's answer is timeless: I did this experiment out of curiosity; in a real situation, focus on code clarity and expressiveness and ignore micro-optimizations.
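
If you want to reproduce the comparison locally instead of on SharpLab, a small program like this (the method names are my own) can be built in Release mode and the resulting IL inspected with ildasm or a similar disassembler:

using System;

class Program
{
    static void WithNullInit()
    {
        String str = null;   // explicit null initialization
        str = "Test";
        Console.WriteLine(str);
    }

    static void WithoutInit()
    {
        String str;          // declaration only
        str = "Test";
        Console.WriteLine(str);
    }

    static void Main()
    {
        WithNullInit();
        WithoutInit();
    }
}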