From what I understand, endianness is when the bytes that compose a multibyte word differ in their order, at least in the most typical case. So a 16-bit integer may be stored as either 0xHHLL or 0xLLHH.
Assuming I don't have that wrong, what I would like to know is when endianness becomes a major factor when sending information between two computers whose endianness may or may not differ.
If I transmit a short integer with the value 1, in the form of a char array and with no correction, is it received and interpreted as 256?
If I decompose and recompose the short integer using the following code, will endianness no longer be a factor?

```cpp
// Sender: serialize the value bit by bit, least significant bit first.
for (n = 0; n < sizeof(uint16) * 8; ++n) {
    stl_bitset[n] = (value >> n) & 1;
}

// Receiver: reassemble the bits in the same order.
for (n = 0; n < sizeof(uint16) * 8; ++n) {
    value |= uint16(stl_bitset[n] & 1) << n;
}
```

Is there a standard way of compensating for endianness?
Thanks in advance!
Comments:

00000001 or as 10000000 ;-) – Kerrek SB

0xHHLL and the like: I don't think that is a good way to represent the concept, because 0x... is a construct at the language-syntax level and endianness is at the memory-organization level. That is, 0xFF12 is 0xFF12 on machines of every endianness. The usual construct is to use hex-editor-type output, or to draw memory as an array of boxes with values in them. – dmckee --- ex-moderator kitten