I'm working with a Fortran 90 code that works just fine for small arrays. For large arrays, however, the program fails when allocating them with the error: severe (41): Insufficient virtual memory.
What I have is:
real*8, allocatable :: M(:,:), C(:,:), K(:,:)
real*8, allocatable :: t(:)
real*8, allocatable :: X(:,:), DX(:,:), D2X(:,:)
real*8, allocatable :: R(:,:)
integer*4 :: nx, nt
integer :: io_stat
...
nx = 15
nt = 6000001
...
allocate(M(nx,nx), C(nx,nx), K(nx,nx), stat=io_stat)
if (io_stat /= 0) stop
allocate(t(nt), stat=io_stat)
if (io_stat /= 0) stop
allocate(X(nx,nt), DX(nx,nt), D2X(nx,nt), R(nx,nt), stat=io_stat)
if (io_stat /= 0) stop
...
The nt parameter is calculated from the time increment I use in the calculations, and this increment must be very small (1e-9 or smaller) for the results to converge. With nt = 600001 the program runs fine, but the results are not accurate. The problem is that I can't increase the size of the arrays, because the program can't allocate them all (the allocation already fails at the DX array). These are not the only arrays in the code, just the significant (big) ones.
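For context, nt comes from dividing the total simulated time by the time increment, roughly like this (the values of dt and T_end here are illustrative, not the exact ones in my code):

```fortran
program nt_from_dt
  implicit none
  integer*4 :: nt
  real*8 :: dt, T_end
  dt    = 1.0d-9   ! time increment needed for convergence
  T_end = 6.0d-3   ! total simulated time (illustrative value)
  ! one point per step, plus the initial instant
  nt = nint(T_end / dt) + 1
  print *, 'nt = ', nt
end program nt_from_dt
```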
I'm using Intel(R) Visual Fortran Compiler Professional Edition 11.1 with Microsoft Visual Studio 2008 Version 9.0.21022.8 RTM, running on 64-bit Windows 8 with 16 GB of RAM. However, Build -> Configuration Manager is set to the Win32 platform, because I use a library that only runs on this platform (I can't change it to x64).
I would like to know how to increase the virtual memory available to the program, so I can run my code with the proper discretization; i.e., I need more memory to allocate all my arrays. How can I do it?
P.S. I ran a similar code in Matlab, and it generated almost 2 GB of data (the X, DX and D2X arrays, which are the results I want). Since I have 16 GB of RAM, I think the computer's memory is not the problem, but rather something with the compiler/code. Am I right?
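To double-check the sizes involved, I wrote this quick program that just multiplies out the dimensions of the big arrays from the code above (8 bytes per real*8 element):

```fortran
program memcheck
  implicit none
  integer*4 :: nx, nt
  real*8 :: bytes_big, bytes_t
  nx = 15
  nt = 6000001
  ! X, DX, D2X and R are each nx-by-nt arrays of 8-byte reals
  bytes_big = 4.0d0 * nx * dble(nt) * 8.0d0
  ! t is a length-nt array of 8-byte reals
  bytes_t = dble(nt) * 8.0d0
  print *, 'X, DX, D2X, R together (GiB): ', bytes_big / 1024.0d0**3
  print *, 't alone (MiB):                ', bytes_t / 1024.0d0**2
end program memcheck
```

For nx = 15 and nt = 6000001 this gives about 2.7 GiB just for the four big arrays, which is more than the Matlab figure above suggests.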