2
votes

I am trying to allocate a real array finn_var(459,299,27,24,nspec) in Fortran. With nspec = 24 it works, but with nspec = 25 it does not. The allocation itself reports no error, but the print command prints nothing rather than zero values, and any use of the array produces a "segmentation fault" error message. The test program is

  program test

  implicit none

  integer                    :: nx, ny, nez, nt, nspec
  integer                    :: allocation_status
  real, allocatable          :: finn_var(:,:,:,:,:)

  nx    = 459
  ny    = 299
  nez   = 27
  nt    = 24
  nspec = 24

  allocate( finn_var(nx, ny, nez, nt, nspec), stat = allocation_status )
  if (allocation_status > 0) then
    print*, "Allocation error for finn_var"
    stop
  end if

  print*, finn_var

  end program test

It should not be a memory issue: I allocated a double precision finn_var(459,299,27,24,24) without problem, and that needs even more memory than the failing single-precision case. What is the reason then?

I use pgf90 on a Linux server. The output of cat /proc/meminfo:

MemTotal:     396191724 kB
MemFree:      66065188 kB
Buffers:        402388 kB
Cached:       274584600 kB
SwapCached:          0 kB
Active:       131679328 kB
Inactive:     191625200 kB
HighTotal:           0 kB
HighFree:            0 kB
LowTotal:     396191724 kB
LowFree:      66065188 kB
SwapTotal:    20971484 kB
SwapFree:     20971180 kB
Dirty:          605508 kB
Writeback:           0 kB
AnonPages:    48317148 kB
Mapped:         123328 kB
Slab:          6612824 kB
PageTables:     132920 kB
NFS_Unstable:        0 kB
Bounce:              0 kB
CommitLimit:  219067344 kB
Committed_AS: 53206972 kB
VmallocTotal: 34359738367 kB
VmallocUsed:    275624 kB
VmallocChunk: 34359462559 kB
HugePages_Total:     0
HugePages_Free:      0
HugePages_Rsvd:      0
Hugepagesize:     2048 kB

The output of ulimit -a:

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 3153920
max locked memory       (kbytes, -l) 32
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 10240
cpu time               (seconds, -t) unlimited
max user processes              (-u) 3153920
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited

I compiled with pgf90. But if I compile with gfortran, there is no problem.

2
Works fine here using gfortran 5.2 on a Mac. - Simon
The ‑Mlarge_arrays pgf90 compiler flag solves it. - apple

2 Answers

4
votes

It doesn't have to be insufficient memory. The size of the array is 2 223 304 200 elements. That is just above the maximum signed 32-bit integer, 2 147 483 647.

It looks like the element count that the compiler uses internally overflows. The internal call to malloc then requests too little memory, and any attempt to access elements near the end of the array fails.

It is a limitation of the compiler in its default settings. It can be set up to use 64-bit addressing with the option ‑Mlarge_arrays.

See http://www.pgroup.com/products/freepgi/freepgi_ref/ch05.html#ArryIndex
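The overflow is easy to reproduce. The sketch below (names are illustrative) computes the element count of finn_var(459,299,27,24,25) once in 64-bit and once in default 32-bit integer arithmetic. The 32-bit product exceeds 2 147 483 647; signed overflow is not formally defined by the Fortran standard, but with the two's-complement wraparound that pgf90 and gfortran show under default flags it comes out negative, which is the kind of value the compiler's internal size computation runs into:

```fortran
program overflow_demo
  implicit none
  integer         :: nx, ny, nez, nt, nspec
  integer         :: count32
  integer(kind=8) :: count64

  nx = 459; ny = 299; nez = 27; nt = 24; nspec = 25

  ! Widening the first factor forces the whole product into 64 bits:
  count64 = int(nx, kind=8) * ny * nez * nt * nspec

  ! The same product in default (32-bit) integer arithmetic overflows:
  count32 = nx * ny * nez * nt * nspec

  print *, '64-bit element count: ', count64   ! 2223304200
  print *, '32-bit element count: ', count32   ! wraps to a negative value
end program overflow_demo
```

The same widening trick (int(nx, kind=8) as the first factor) is a quick way to check whether any allocation in your own code crosses the 32-bit boundary.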

2
votes

Your problem is most likely a memory issue.

Your array demands 459*299*27*24 * 4 B per nspec (assuming a default real requires 4 B of memory). For nspec == 24 this results in a memory requirement of approximately 7.95 GiB, while nspec == 25 needs around 8.28 GiB.
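That arithmetic can be checked directly. A minimal sketch, using 64-bit integers so the byte counts themselves cannot overflow:

```fortran
program memcalc
  implicit none
  integer(kind=8) :: elems, bytes24, bytes25

  ! Elements per nspec slice: 459 * 299 * 27 * 24
  elems   = 459_8 * 299 * 27 * 24
  bytes24 = elems * 24 * 4      ! nspec = 24, 4 bytes per default real
  bytes25 = elems * 25 * 4      ! nspec = 25

  print *, bytes24 / 2.0_8**30, 'GiB for nspec = 24'   ! ~7.95
  print *, bytes25 / 2.0_8**30, 'GiB for nspec = 25'   ! ~8.28
end program memcalc
```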

I guess your physical memory is limited to 8 GiB, or some ulimit restricts the amount of memory allowed for this process.