
When investigating a problem, I stumbled upon code that boils down to the following example:

const unsigned long long int MAX = 9223372036854775807ULL; // 2^63 - 1

void double_it(double *d) {
    for (unsigned long long int i = 0; i < MAX; i++) {
        d[i] = 2 * d[i];
    }
}

Due to a bug, the for loop runs far past the available memory and the program crashes. But that is not the interesting part.

When compiled with gcc (gcc -O2 -Wall -std=c99 -c), this code leads to the following warning:

warning: iteration 2305843009213693951ull invokes undefined behavior [-Waggressive-loop-optimizations]

whose cause I don't understand.

There are some similar questions on Stack Overflow, e.g.:

  1. g++ "warning: iteration ... invokes undefined behavior" for Seemingly Unrelated Variable
  2. Why does this loop produce "warning: iteration 3u invokes undefined behavior" and output more than 4 lines?

But those problems were integer overflows; here the counter i is seemingly nowhere near an overflow.

Compiling the same code without -O2 does not produce the warning, so I guess -Waggressive-loop-optimizations only kicks in when the optimizer runs.

So actually, I have two questions:

  1. What is the problem with this code (compiled for Linux x86-64)?
  2. Why is there no warning without -O2? If this code is faulty, I would expect it to be faulty whether it is optimized or not.

The behavior is the same for g++ (see it online at coliru).

Do you think your computer has around 8388600 terabytes of RAM or more? - ForceBru
Don't use hand-crafted constants. limits.h provides ULLONG_MAX, why not use it? - too honest for this site
"Why there is no warning without -O2? If this code is faulty, I would expect it to be faulty no matter whether it is optimized or not." gcc does not track everything it can when the optimizer is not enabled, probably to speed up compilation of debug builds. This prominently includes uninitialized variables. The code itself would be broken both in debug and release of course, but you'd only get a diagnostic in release. - Baum mit Augen
@ForceBru I know it is not a very practical question... but if I had? - ead
That has nothing to do with "minimal", but with a correct example. You just use the wrong types and (possibly) the wrong limits. You just cannot assume all values of an unsigned long long are valid indexes for an array. - too honest for this site

1 Answer


But those problems were integer overflows; here the counter i is seemingly nowhere near an overflow.

Why would you think that?

d[i] is the same as *(d + i), so the question is whether d + i overflows, and it clearly does: a double is larger than 2 bytes (I'm not sure the standard spells out a minimum anywhere, but it's a safe assumption for your architecture), so 2^63 - 1 elements span more than 2^64 bytes, which cannot fit in a 64-bit address space. That is also where the odd number in the warning comes from: with sizeof(double) == 8, iteration 2305843009213693951 (that is, 2^61 - 1) is where the byte offset i * 8 reaches 2^64 - 8, the very top of the address space, so d + i wraps around. To be perfectly correct, the standard's rule is not phrased in terms of sizeof, but that is what the compiler turns the indexing into internally.
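
As a quick sanity check of that arithmetic (a sketch assuming sizeof(double) == 8, which holds on Linux x86-64), you can reproduce the number from the warning directly:

#include <stdio.h>

int main(void) {
    // The iteration GCC complains about, copied from the warning above.
    unsigned long long warned = 2305843009213693951ULL; // 2^61 - 1

    // Byte offset of d + i when sizeof(double) == 8.
    unsigned long long offset = warned * 8ULL;

    printf("%llu\n", offset);        // 18446744073709551608 == 2^64 - 8
    printf("%llu\n", offset + 8ULL); // 0: one more element wraps past 2^64
    return 0;
}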

In C11 §6.5.6 we can read:

If both the pointer operand and the result point to elements of the same array object, or one past the last element of the array object, the evaluation shall not produce an overflow; otherwise, the behavior is undefined.

We can take the contrapositive of that sentence: if the addition provably overflows, then the behavior must have been undefined.
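
To make the rule concrete, here is a minimal illustration (the array and function names are made up for this example):

void pointer_rule_demo(void) {
    double a[4];
    double *p;

    p = a + 4; // fine: one past the last element is explicitly allowed
    p = a + 5; // undefined behavior: beyond one-past-the-end,
               // even though p is never dereferenced
    (void)p;   // suppress unused-variable warnings
}

In your loop, d + i leaves the array that d points to long before i gets anywhere near an overflow of its own, so the undefined behavior does not depend on i wrapping at all.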

The reason you don't get a warning without -O2 is that the compiler is under no obligation to diagnose undefined behavior at all; the warning is a courtesy. With optimizations the compiler spends more time reasoning about what your code does, so it gets the chance to spot more bad behavior. Without optimizations it doesn't spend time on that analysis.
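
If you want to see that reasoning in action, the second question you linked is instructive. Adapted to C (the original uses std::cout), the signed multiplication below overflows once i reaches 3; at -O2 GCC warns about iteration 3 and may transform the loop based on that knowledge:

#include <stdio.h>

void overflow_loop(void) {
    // i * 1000000000 exceeds INT_MAX once i == 3, which is signed
    // integer overflow and therefore undefined behavior.
    for (int i = 0; i < 4; ++i)
        printf("%d\n", i * 1000000000);
}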