a and b are both of type int (signed int), which is typically 32 bits long, i.e. 4 bytes.
But the enum ENUM does not need that much space:

00000000000000000000000000000000 equals a
00000000000000000000000000000001 equals b
So the author of the code decided to make the ENUM member shorter than an int by declaring it as a bit-field with a width of 8 bits (the minimum addressable size being 1 byte):

00000000 or 00000001

He could have used the char type from the beginning instead, which is also 1 byte long.
Some compilers offer a feature that lets an enum be smaller than int. With GCC's -fshort-enums option, the compiler uses the smallest integer type that can still hold all of the enumeration's values.
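A quick way to see the effect of -fshort-enums is to compile the same small program with and without the flag (the file name and enum here are just placeholders):

/* size_demo.c
 *
 * gcc size_demo.c && ./a.out               -> sizeof(enum Small) is usually 4
 * gcc -fshort-enums size_demo.c && ./a.out -> usually 1, the smallest type
 *                                             that can hold the values 0 and 1
 */
#include <stdio.h>

enum Small { FIRST, SECOND };   /* values 0 and 1 fit in a single byte */

int main(void) {
    printf("sizeof(enum Small) = %zu Bytes\n", sizeof(enum Small));
    return 0;
}

Keep in mind that -fshort-enums changes the ABI, so every translation unit that shares such an enum must be compiled with the same setting.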
Here is an example of how a bit-field saves memory: as you can see from the output, the someBits struct is smaller than the twoInts struct.
#include <stdio.h>

struct oneInt {
    int x;
};

struct twoInts {
    int x;
    int y;
};

struct someBits {
    int x:2; // 2 bits
    int y:6; // 6 bits
};

int main(int argc, char** argv) {
    printf("type int = %zu Bytes\n", sizeof(int));
    printf("oneInt = %zu Bytes\n", sizeof(struct oneInt));
    printf("twoInts = %zu Bytes\n", sizeof(struct twoInts));
    printf("someBits = %zu Bytes\n", sizeof(struct someBits));
    return 0;
}
Output:
type int = 4 Bytes
oneInt = 4 Bytes
twoInts = 8 Bytes
someBits = 4 Bytes