I was wondering why both utf-16le and utf-16be exist? Is it considered "inefficient" for a big-endian environment to process little-endian data?
Currently, this is what I use to store a 2-byte variable locally:
unsigned char octets[2];
short int shortint = 12345; /* (assuming short int = 2 bytes) */

/* store little-endian: low byte first, then high byte */
octets[0] = shortint & 255;
octets[1] = (shortint >> 8) & 255;
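
For completeness, the matching read-back would be something along these lines (just a sketch, assuming the same low-byte-first layout and a 16-bit short):

/* sketch: rebuild the value from the two bytes, low byte first */
short int restored = (short int)(octets[0] | (octets[1] << 8));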
I know that as long as I store and read with a fixed endianness locally, there is no endian risk. I was wondering whether this approach is considered "inefficient"? What would be the most "efficient" way to store a 2-byte variable? (While restricting the data to the environment's endianness, local use only.)
Thanks, Doori Bar