73
votes

Today my application threw an OutOfMemoryException. This always seemed almost impossible to me, since I have 4 GB of RAM and plenty of virtual memory as well. The error happened when I tried to add an existing collection to a new list.

List<Vehicle> vList = new List<Vehicle>(selectedVehicles);  

To my understanding, not much memory should be allocated here, since the vehicles my new list should contain already exist in memory. I have to admit that Vehicle is a very complex class, and I tried to add about 50,000 items to the new list at once. But since all the vehicles in the application come from a database that is only 200 MB in size, I have no idea what could cause an OutOfMemoryException at this point.

10
What is the value (and type) of selectedVehicles? – harold
When the OutOfMemoryException was thrown, did you attach to the process with a debugger and see what the problem might be? How big were the objects? The .NET Framework has a hard limit of 2 GB for object size, minus the overhead consumed by the framework itself. – Cody Gray♦
Is Vehicle possibly a struct instead of a class? – Anthony Pegram
200 MB of database space may easily take up more than twice that much when converted to .NET objects. Afterward, it may be a smaller footprint, but the framework is trying to grab a large, contiguous chunk of memory at once that is not available. – StingyJack
Your statement about how much memory you have and how much virtual memory you have allowed your system shows a lack of understanding of how virtual and physical memory work. You might want to read some information on that subject. – Security Hound

10 Answers

84
votes

Two points:

  1. If you are running 32-bit Windows, your process won't have all 4 GB accessible, only 2 GB.
  2. Don't forget that the underlying implementation of List<T> is an array. If your memory is heavily fragmented, there may not be enough contiguous space to allocate your list's backing array, even though in total you have plenty of free memory (see the sketch below).
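
As a rough illustration of the fragmentation point (all sizes below are arbitrary demonstration values, not taken from the question): many small allocations can succeed while one large, contiguous allocation fails.

// Total free memory can be ample while no single contiguous block is.
var chunks = new List<byte[]>();
for (int i = 0; i < 1000; i++)
    chunks.Add(new byte[1024 * 1024]);      // 1,000 separate 1 MB blocks: usually fine

byte[] big = new byte[1024 * 1024 * 1024];  // one contiguous 1 GB block: may throw
                                            // OutOfMemoryException in a 32-bit process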
94
votes

Three-year-old topic, but I found another working solution. If you're sure you have enough free memory, are running a 64-bit OS, and are still getting the exception, go to Project properties -> Build tab and make sure Platform target is set to x64.

(Screenshot: the project's Build tab with Platform target set to x64)
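
If you want to verify at run time that the setting actually took effect, here is a quick sanity check (a sketch, not part of the original answer):

// Prints whether the current process is really 64-bit; "False" means
// the Platform target (or "Prefer 32-bit") is still forcing 32-bit.
Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);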

74
votes

.NET 4.5 no longer has a 2 GB limitation for objects. Add these lines to App.config:

<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>

and it will be possible to create very large objects without getting an OutOfMemoryException.

Please note it will only work on 64-bit operating systems, in a 64-bit process!
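
A small sketch of what the switch enables (the element count is just a demonstration value; this assumes .NET 4.5+ and an x64 process):

// ~4 GB in a single array (500,000,000 longs x 8 bytes). Without
// gcAllowVeryLargeObjects this throws OutOfMemoryException even on
// x64 with plenty of free RAM. Note that no single array dimension
// may exceed roughly 2.1 billion elements even with the switch on.
long[] huge = new long[500000000];
Console.WriteLine(huge.LongLength);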

14
votes

Data stored in a database and the same data loaded into memory in your application can have very different sizes.

There isn't a way to get the exact size of your object, but you could call

GC.GetTotalMemory()

after a certain number of objects have been loaded and see how much your memory usage changes as you load the list.
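
A sketch of that measurement (LoadVehicles is a hypothetical stand-in for however the vehicles actually get loaded):

// Force a full collection before and after loading so the delta mostly
// reflects the newly loaded objects rather than collectible garbage.
long before = GC.GetTotalMemory(true);
List<Vehicle> vehicles = LoadVehicles();    // hypothetical loading call
long after = GC.GetTotalMemory(true);
Console.WriteLine("Approximate managed bytes used: {0:N0}", after - before);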

If it is the list that is causing the excessive memory usage, then we can look at ways to minimize it, starting with the question of why you want 50,000 objects loaded into memory all at once in the first place. Wouldn't it be better to fetch them from the DB as you need them?

If you take a look here: http://www.dotnetperls.com/array-memory you will also see that objects in .NET take up more memory than their actual data. A generic list is even more of a memory hog than an array, and if your object contains a generic list itself, it will grow even faster.

8
votes

OutOfMemoryException (on 32-bit machines) is just as often about fragmentation as about actual hard limits on memory - you'll find lots about this, but here's my first Google hit briefly discussing it: http://blogs.msdn.com/b/joshwil/archive/2005/08/10/450202.aspx. (@Anthony Pegram is referring to the same problem in his comment above.)

That said, there is one other possibility that comes to mind for your code above: since you're using the IEnumerable<T> constructor of List<T>, you may not be giving the object any hint about the size of the collection you're passing to the List constructor. If the object you are passing is not a collection (does not implement the ICollection<T> interface), then behind the scenes the List implementation is going to need to grow several (or many) times, each time leaving behind a too-small array that needs to be garbage collected. The garbage collector probably won't get to those discarded arrays fast enough, and you'll get your error.

The simplest fix for this would be to use the List(int capacity) constructor to tell the framework what backing array size to allocate (even if you're estimating and just guessing "50000" for example), and then use the AddRange(IEnumerable collection) method to actually populate your list.

So, simplest "Fix" if I'm right: replace

List<Vehicle> vList = new List<Vehicle>(selectedVehicles);

with

List<Vehicle> vList = new List<Vehicle>(50000);  
vList.AddRange(selectedVehicles);

All the other comments and answers still apply in terms of overall design decisions - but this might be a quick fix.

Note (as @Alex commented below), this is only an issue if selectedVehicles is not an ICollection.

5
votes

My development team resolved this situation as follows:

We added the following post-build script to the .exe project and compiled again. With the x86 target this increased the usable memory by about 1.5 GB, and with the x64 Platform target the application could use about 3.2 GB. Our application is 32-bit.


Script:

if exist "$(DevEnvDir)..\tools\vsvars32.bat" (
    call "$(DevEnvDir)..\tools\vsvars32.bat"
    editbin /largeaddressaware "$(TargetPath)"
)
3
votes

You should not try to bring the whole list into memory at once; the size of the elements in the database is not the same as the size they take up in memory. If you want to process the elements, use a foreach loop and take advantage of Entity Framework's lazy loading so you don't bring all the elements into memory at once. If you want to display the list, use pagination (.Skip() and .Take()), as in the sketch below.
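
A sketch of that pagination pattern (db stands for a hypothetical Entity Framework context with a Vehicles set, and the page size is arbitrary):

const int pageSize = 500;               // illustrative page size
int pageIndex = 0;
List<Vehicle> page;
do
{
    page = db.Vehicles
             .OrderBy(v => v.Id)        // Skip/Take require a stable ordering
             .Skip(pageIndex * pageSize)
             .Take(pageSize)
             .ToList();
    foreach (Vehicle vehicle in page)
        Process(vehicle);               // hypothetical per-item work
    pageIndex++;
} while (page.Count == pageSize);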

3
votes

I know this is an old question but since none of the answers mentioned the large object heap, this might be of use to others who find this question ...

Any memory allocation in .NET that is over 85,000 bytes comes from the large object heap (LOH), not the normal small object heap. Why does this matter? Because the large object heap is not compacted, which means it gets fragmented, and in my experience this inevitably leads to out-of-memory errors.

In the original question the list has 50,000 items in it. Internally a list uses an array, and assuming 32-bit, that requires 50,000 x 4 bytes = 200,000 bytes (or double that if 64-bit). So that memory allocation is coming from the large object heap.

So what can you do about it?

If you are using a .NET version prior to 4.5.1, then about all you can do is be aware of the problem and try to avoid it. So, in this instance, instead of having one list of vehicles you could have a list of lists of vehicles, provided no inner list ever had more than about 18,000 elements in it (keeping each backing array below the 85,000-byte threshold). That can lead to some ugly code, but it is a viable workaround - see the sketch below.
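
A rough sketch of that chunking workaround (sizes assume a 32-bit process; selectedVehicles is the collection from the question):

const int chunkSize = 18000;            // 18,000 x 4-byte references = 72,000 bytes, below the LOH threshold
var chunks = new List<List<Vehicle>>();
foreach (Vehicle v in selectedVehicles)
{
    // Start a new pre-sized inner list whenever the current one is full,
    // so no inner backing array ever lands on the large object heap.
    if (chunks.Count == 0 || chunks[chunks.Count - 1].Count == chunkSize)
        chunks.Add(new List<Vehicle>(chunkSize));
    chunks[chunks.Count - 1].Add(v);
}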

If you are using .net 4.5.1 or later then the behaviour of the garbage collector has changed subtly. If you add the following line where you are about to make large memory allocations:

System.Runtime.GCSettings.LargeObjectHeapCompactionMode = System.Runtime.GCLargeObjectHeapCompactionMode.CompactOnce;

it will force the garbage collector to compact the large object heap - next time only.

It might not be the best solution but the following has worked for me:

int tries = 0;
while (tries++ < 2)
{
  try
  {
    // . . some large allocation . .
    return;
  }
  catch (System.OutOfMemoryException)
  {
    // Ask the GC to compact the large object heap on its next blocking
    // collection, trigger that collection, and retry the allocation once.
    System.Runtime.GCSettings.LargeObjectHeapCompactionMode = System.Runtime.GCLargeObjectHeapCompactionMode.CompactOnce;
    GC.Collect();
  }
}

Of course, this only helps if you have the physical (or virtual) memory available.

3
votes

As .NET progresses, so does its ability to add new 32-bit configuration options that trip everyone up, it seems.

If you are on .NET Framework 4.7.2, do the following:

  1. Go to Project Properties
  2. Build
  3. Uncheck 'Prefer 32-bit'
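
To confirm the change at run time, a quick sanity check (a sketch, not from the original answer; IntPtr.Size is 8 in a 64-bit process):

// With 'Prefer 32-bit' unchecked, an AnyCPU build on 64-bit Windows
// should print 8 here; 4 means the process is still running as 32-bit.
Console.WriteLine("Pointer size in bytes: " + IntPtr.Size);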

Cheers!

0
votes

While the GC compacts the small object heap as part of an optimization strategy to eliminate memory holes, it never compacts the large object heap, for performance reasons: the cost of compaction is too high for large objects (greater than 85 KB in size). Hence, if you run a program that uses many large objects on an x86 system, you might encounter OutOfMemoryExceptions; if you run the same program on an x64 system, you might instead end up with a fragmented heap.