22 votes

I was just wondering how disastrous integer overflow really is. Take the following example program:

#include <iostream>

int main()
{
    int a = 46341;
    int b = a * a; // 46341 * 46341 = 2147488281, which does not fit in a 32-bit int (INT_MAX is 2147483647)
    std::cout << "hello world\n";
}

Since a * a overflows on 32-bit platforms, and integer overflow triggers undefined behavior, do I have any guarantees at all that hello world will actually appear on my screen?


I removed the "signed" part from my question based on the following standard quotes:

(§5/5 C++03, §5/4 C++11) If during the evaluation of an expression, the result is not mathematically defined or not in the range of representable values for its type, the behavior is undefined.

(§3.9.1/4) Unsigned integers, declared unsigned, shall obey the laws of arithmetic modulo 2^n where n is the number of bits in the value representation of that particular size of integer. This implies that unsigned arithmetic does not overflow because a result that cannot be represented by the resulting unsigned integer type is reduced modulo the number that is one greater than the largest value that can be represented by the resulting unsigned integer type.
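
As a quick sanity check, the unsigned case really is well-defined; for example, this prints 0:

#include <iostream>
#include <limits>

int main()
{
    unsigned int u = std::numeric_limits<unsigned int>::max();
    std::cout << u + 1 << '\n'; // wraps modulo 2^n, so this prints 0
}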

I've never seen an overflow in C++ cause an issue. The number will just happily wrap around and set the overflow flag on the processor. - Brain2000
@Brain2000 They mostly cause optimization issues, such as (a + 1 > a) being always true despite overflow (see the sketch after these comments). - Pubby
possible duplicate of GCC Fail? Or Undefined Behavior? - Xeo
@Xeo, not really a duplicate, just an example where overflow had an unexpected result. - AProgrammer
@LokiAstari: The C++ standard says "If during the evaluation of an expression, the result is not mathematically defined or not in the range of representable values for its type, the behavior is undefined." - fredoverflow
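
To see Pubby's point concretely, here is a minimal sketch (the function name is mine): because signed overflow is undefined, GCC at -O2 may compile the comparison down to a constant true, even though a two's-complement wrap would make a + 1 negative for a == INT_MAX.

// May be folded to "return true": the optimizer is allowed to assume
// that a + 1 never overflows, since signed overflow is undefined behavior.
bool is_less_than_next(int a)
{
    return a + 1 > a;
}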

3 Answers

21 votes

As pointed out by @Xeo in the comments (I actually brought it up in the C++ chat first):
Undefined behavior really means it, and it can hit you when you least expect it.

The best example of this is here: Why does integer overflow on x86 with GCC cause an infinite loop?

On x86, signed integer overflow is just a simple wrap-around. So normally, you'd expect the same thing to happen in C or C++. However, the compiler can intervene and use the undefined behavior as an opportunity to optimize.

In the example taken from that question:

#include <iostream>
using namespace std;

int main(){
    int i = 0x10000000;  // 2^28; doubling it three times reaches 2^31, which overflows a 32-bit int

    int c = 0;
    do{
        c++;
        i += i;          // signed overflow on the third iteration: undefined behavior
        cout << i << endl;
    }while (i > 0);      // GCC may treat this as always true: i starts positive and "cannot" overflow

    cout << c << endl;
    return 0;
}

When compiled with optimizations enabled, GCC assumes that signed overflow never happens, optimizes out the loop test, and turns this into an infinite loop.
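
A quick way to see this for yourself (a sketch; the file name is mine, and the exact behavior varies by GCC version):

g++ -O2 loop.cpp            # the test i > 0 is folded to true: the loop never terminates
g++ -O0 loop.cpp            # typically terminates after three iterations via two's-complement wrap
g++ -O2 -fwrapv loop.cpp    # -fwrapv defines signed overflow as wrapping, so the loop terminates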

8 votes

You may trigger some hardware safety feature, so no, you don't have any guarantee.

Edit: Note that gcc has the -ftrapv option (but it doesn't seem to work for me).
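
For anyone who wants to reproduce this, a minimal test sketch (the file name and input value are my own; whether it actually traps seems to depend on the GCC version and optimization level, which may explain the caveat above):

#include <iostream>

int main()
{
    int a;
    std::cin >> a;     // read 46341 at run time so the compiler cannot constant-fold the overflow away
    int b = a * a;     // with -ftrapv this multiplication is supposed to abort on overflow
    std::cout << b << '\n';
}

Compile with g++ -ftrapv trapv.cpp and feed it 46341.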

5 votes

There are two views about undefined behavior. One view holds that it is there to cater for strange hardware and other special cases, but that code should usually behave sanely. The other view is that anything can happen. Depending on the source of the UB, people hold different opinions.

While the UB around overflow was probably introduced to account for hardware that traps or saturates on overflow, and for differences in results between representations (so one could argue for the first view in this case), the people who write optimizers hold dearly to the view that if the standard doesn't guarantee something, anything at all can happen. They exploit every bit of that liberty to generate machine code that runs faster, even if the result no longer makes sense.

So when you run into undefined behavior, assume that anything can happen, however reasonable a given behavior may seem.