2
votes

I have a mystery that I do not have an answer for. I have written a simple program in C++ (and I should say that I'm not a professional C++ developer). Here it is:

#include <iostream>

int main(){
  const int SIZE = 1000;
  int pool1 [SIZE];
  int pool2 [SIZE];
  int result [SIZE*SIZE];

  //Prepare data
  for(int i = 0; i < SIZE; i++){
    pool1[i] = i + 1;
    pool2[i] = SIZE - i;
  }

  //Run test
  for(int i = 0; i < SIZE; i++){
    for(int j = 0; j < SIZE; j++){
      result[i*SIZE + j] = pool1[i]*pool2[j];
    }
  }

  return 0; 
}

The program seems to work (I use it as a kind of benchmark for different languages), but then I ran it with valgrind and it started complaining:

==25912== Invalid read of size 4
==25912==    at 0x804864B: main (in /home/renra/Dev/Benchmarks/array_iteration/array_iteration_cpp)
==25912==  Address 0xbee79da0 is on thread 1's stack

==25912== Invalid write of size 4
==25912==    at 0x8048632: main (in /home/renra/Dev/Benchmarks/array_iteration/array_iteration_cpp)
==25912==  Address 0xbeaa9498 is on thread 1's stack

==25912== More than 10000000 total errors detected.  I'm not reporting any more.
==25912== Final error counts will be inaccurate.  Go fix your program!

Hmm, does not look good. Size 4 probably refers to the size of an int. As you can see, at first I was using SIZE 1000, so the result array would be 1,000,000 ints long. So I thought it was just overflowing and I needed a larger value type (at least for the iterators and the result array). I used unsigned long long (the max of unsigned long long is 18,446,744,073,709,551,615, and all I needed was 1,000,000 - SIZE*SIZE). But I'm still getting these error messages (and they still say the read and write size is 4, even though sizeof(long long) is 8).

Also, the messages are not there when I use a lower SIZE, but they seem to kick in exactly at SIZE 707 regardless of the type used. Does anybody have a clue? I'm quite curious :-).

1
You should use valgrind with your program compiled in debug mode (with -g) in order to obtain precise line numbers for the errors. - HAL9000
A 1,000,000-element array on the stack is just a bad idea, don't do that. - Jens Gustedt
Do you guys have some kind of rule of thumb as to how much data is still good to keep on the stack and when is it already too much? - Renra
@Renra - a typical default stack is a few megabytes. My rule of thumb is anything < 1K is fine, 1K-10K is not recommended, 10K-100K is dangerous, >100K don't do it. Another metric you could use is cache size. If the object you have is bigger than a cache size (say 4K), you won't necessarily get a speed advantage on the stack compared to the heap. - Mark Lakata

1 Answer

4
votes

Neither C nor C++ places a clear limit on the size of arrays you can put on the stack, and there is usually no built-in protection either. Just don't allocate such large chunks as automatic (scope-local) variables. Use malloc in C or new in C++ for that purpose.