I can reproduce the problem with this simple demo:
// bool_test_func.cpp
#include <stdio.h>

void func(bool* b) {
    int a = (*b ? 0 : 1);
    printf("%d\n", a); // EXPECT either 0 or 1 here
}
// bool_test.cpp
void func(bool* b);

int main() {
    int n = 128;
    func((bool*)&n);
    return 0;
}
Compile and run at -O0 (expected result):

g++ -g -O0 -Wall -o bool_test bool_test.cpp bool_test_func.cpp
mikewei@maclinux:~/testing/c++$ ./bool_test
0
Compile and run at -O1 (unexpected result):

g++ -g -O1 -Wall -o bool_test bool_test.cpp bool_test_func.cpp
mikewei@maclinux:~/testing/c++$ ./bool_test
129
When I check the -O2 ASM code, I think it is a g++ bug: g++'s optimized code always assumes the bool value is either 1 or 0:
00000000004005e6:
  4005e6:  48 83 ec 08            sub    $0x8,%rsp
  4005ea:  0f b6 37               movzbl (%rdi),%esi
  4005ed:  83 f6 01               xor    $0x1,%esi    # just XOR the bool val
  4005f0:  40 0f b6 f6            movzbl %sil,%esi
  4005f4:  bf 94 06 40 00         mov    $0x400694,%edi
  4005f9:  b8 00 00 00 00         mov    $0x0,%eax
  4005fe:  e8 9d fe ff ff         callq  4004a0
  400603:  48 83 c4 08            add    $0x8,%rsp
  400607:  c3                     retq
  400608:  0f 1f 84 00 00 00 00   nopl   0x0(%rax,%rax,1)
  40060f:  00
gcc version 4.9.2 (Debian 4.9.2-10)
Is this g++ behavior by design? How can I disable this wrong optimization? Thanks~
The int is type-punned as a bool, so undefined behaviour ensues. – Quentin