Why do people think that a memory leak in .NET is not the same as any other leak?
A memory leak occurs when you take hold of a resource and never let it go. You can do this in both managed and unmanaged code.
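For example, a purely managed leak can be as simple as a collection (or an event subscription) that keeps references alive forever. A minimal sketch, with made-up class names:

```csharp
using System.Collections.Generic;

// A "managed leak": the objects stay reachable because something still
// references them, so the GC can never collect them, even though the
// program will never use them again.
static class Cache
{
    // Grows forever; every entry stays reachable for the life of the process.
    public static readonly List<byte[]> Entries = new List<byte[]>();
}

class Program
{
    static void Main()
    {
        for (int i = 0; i < 1000; i++)
        {
            // Each buffer is still referenced by the static list, so it is not
            // garbage -- the GC has nothing to clean up, yet memory keeps growing.
            Cache.Entries.Add(new byte[1024 * 1024]);
        }
    }
}
```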
In .NET, as in other programming environments, garbage collection and similar techniques exist to minimize the situations that make your application leak.
But the best way to prevent memory leaks is to understand the underlying memory model and how things work on the platform you are using.
Believing that the GC and other magic will clean up your mess is the short road to memory leaks that will be hard to track down later.
When writing unmanaged code, you normally make sure to clean up; you know that the resources you take hold of are your responsibility to release, not the janitor's.
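With unmanaged memory, for instance, the GC never gets involved at all, so forgetting a single release call leaks the allocation until the process exits. A small sketch using `Marshal`:

```csharp
using System;
using System.Runtime.InteropServices;

class UnmanagedBuffer
{
    static void Main()
    {
        // Unmanaged memory: the GC knows nothing about this allocation,
        // so releasing it is entirely our responsibility.
        IntPtr buffer = Marshal.AllocHGlobal(1024);
        try
        {
            // ... use the buffer ...
        }
        finally
        {
            // Forget this call and those 1024 bytes are leaked
            // for the remaining lifetime of the process.
            Marshal.FreeHGlobal(buffer);
        }
    }
}
```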
In .NET, on the other hand, a lot of people think that the GC will clean up everything. It does do some of the work for you, but you need to make sure it actually happens. .NET wraps a lot of things, so you do not always know whether you are dealing with a managed or an unmanaged resource, and you need to find out which one it is. Fonts, GDI resources, Active Directory, databases and so on are the typical things to look out for.
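A common way to release such wrapped resources deterministically is the `using`/`IDisposable` pattern; here is a small sketch with a GDI+ font (the surrounding method is just for illustration):

```csharp
using System.Drawing;

class FontExample
{
    static void Render()
    {
        // Font wraps an unmanaged GDI+ handle. The GC will finalize it
        // eventually, but "eventually" is not good enough if you create
        // many of them -- dispose deterministically instead.
        using (var font = new Font("Arial", 12f))
        {
            // ... draw with the font ...
        } // Dispose() releases the underlying handle here.
    }
}
```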
> In managed terms I will put my neck on the line to say it does go away once the process is killed/removed.
I see lots of people have this thought, and I really hope it will end. You cannot ask the user to terminate your app to clean up your mess!
Take a look at a browser (IE, Firefox, etc.): open, say, Google Reader, leave it running for a few days, and see what happens.
If you then open another tab in the browser, surf to some site, and close the tab that hosted the page that made the browser leak, do you think the browser will release the memory? Not IE. On my computer, IE will easily eat 1 GiB of memory in a short amount of time (about 3-4 days) if I use Google Reader. Some news pages are even worse.