I want to write signed integer values into a file in a platform independent way.
If they were unsigned, I would just convert them from host byte order to LE (or BE) with the endian(3) family of functions.
I'm not sure how to deal with signed integers though. If I cast them to unsigned values, the round trip back may not restore the sign, since the C standard does not guarantee that
(int) ((unsigned) -1) == -1
The other option would be to cast a pointer to the value (i.e., reinterpret the byte sequence as unsigned), but I'm not convinced that converting endianness after that gives anything sensible.
What is the proper way for platform independent signed integer storage?
Update:
I know that in practice almost all architectures use two's-complement representation, so I can losslessly convert between signed and unsigned integers. However, this question is meant to be more theoretical.
Just rolling my own integer representation (be that storing the decimal digits as ASCII characters, or storing the sign bit separately) is of course a solution. However, I'm interested in whether there is a way that works without completely abandoning the native binary representation.
htonl() and ntohl() – Brian Roach

htonl and ntohl are really the same thing as endian(3), and the problem with those functions is described in the question. – Nikratio