It is always one in C99, section 6.5.3.4:
When applied to an operand that has type char, unsigned char, or signed char, (or a qualified version thereof) the result is 1.
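A minimal sketch illustrating the quoted guarantee; nothing here is implementation-specific, so all three lines should print 1 on any conforming C99 implementation:

```c
#include <stdio.h>

int main(void)
{
    /* Per C99 6.5.3.4, each of these is guaranteed to be 1. */
    printf("sizeof(char)          = %zu\n", sizeof(char));
    printf("sizeof(signed char)   = %zu\n", sizeof(signed char));
    printf("sizeof(unsigned char) = %zu\n", sizeof(unsigned char));
    return 0;
}
```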
Edit: not part of your question, but for interest, from Harbison and Steele, 3rd ed. (pre-C99), p. 148:
A storage unit is taken to be the amount of storage occupied by one character; the size of an object of type char is therefore 1.
Edit: In answer to your updated question, the following question and answer from Harbison and Steele are relevant (ibid, Ex. 4 of Ch. 6):
Is it allowable to have a C implementation in which type char can represent values ranging from -2,147,483,648 through 2,147,483,647? If so, what would be sizeof(char) under that implementation? What would be the smallest and largest ranges of type int?
Answer (ibid, p. 382):
It is permitted (if wasteful) for an implementation to use 32 bits to represent type char. Regardless of the implementation, the value of sizeof(char) is always 1.
While this does not specifically address a case where, say, bytes are 8 bits and a char is 4 of those bytes (actually impossible under the C99 definition, see below), the fact that sizeof(char) is always 1 is clear from the C99 standard and from Harbison and Steele.
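Since sizeof(char) is 1 by definition, the way to ask "how wide is a char on this implementation?" is CHAR_BIT, not sizeof. A small sketch; the hypothetical 32-bit-char implementation from the exercise would print 1 and 32 here:

```c
#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* sizeof(char) is 1 on every conforming implementation;
       the number of bits per char is reported by CHAR_BIT. */
    printf("sizeof(char)   = %zu\n", sizeof(char));
    printf("CHAR_BIT       = %d\n", CHAR_BIT);
    printf("bits in an int = %zu\n", sizeof(int) * CHAR_BIT);
    return 0;
}
```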
Edit: In fact (this is in response to your upd 2 question), as far as C99 is concerned, sizeof(char) is measured in bytes; from section 6.5.3.4 again:
The sizeof operator yields the size (in bytes) of its operand
So, combined with the quotation above, a byte of 8 bits with a char made of 4 of those bytes is impossible: as far as C99 is concerned, a byte is the same thing as a char.
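One consequence, sketched below: because sizeof counts in bytes and a byte is exactly a char, any object can be copied into an array of sizeof(object) unsigned chars with nothing left over. The value 42 and the type int here are just for illustration:

```c
#include <stdio.h>
#include <string.h>

int main(void)
{
    int x = 42;
    /* sizeof x is the number of bytes, i.e. the number of chars,
       that x occupies, so this buffer holds x exactly. */
    unsigned char bytes[sizeof x];
    memcpy(bytes, &x, sizeof x);

    for (size_t i = 0; i < sizeof x; i++)
        printf("byte %zu: 0x%02x\n", i, (unsigned)bytes[i]);
    return 0;
}
```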
In answer to your mention of the possibility of a 7-bit char: this is not possible in C99. According to section 5.2.4.2.1 of the standard, the minimum is 8:
Their implementation-defined values shall be equal or greater [my emphasis] in magnitude to those shown, with the same sign.
— number of bits for smallest object that is not a bit-field (byte)
**CHAR_BIT 8**
— minimum value for an object of type signed char
**SCHAR_MIN -127** // −(2^7 − 1)
— maximum value for an object of type signed char
**SCHAR_MAX +127** // 2^7 − 1
— maximum value for an object of type unsigned char
**UCHAR_MAX 255** // 2^8 − 1
— minimum value for an object of type char
**CHAR_MIN** see below
— maximum value for an object of type char
**CHAR_MAX** see below
[...]
If the value of an object of type char is treated as a signed integer when used in an expression, the value of CHAR_MIN shall be the same as that of SCHAR_MIN and the value of CHAR_MAX shall be the same as that of SCHAR_MAX. Otherwise, the value of CHAR_MIN shall be 0 and the value of CHAR_MAX shall be the same as that of UCHAR_MAX. The value UCHAR_MAX shall equal 2^CHAR_BIT − 1.
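A short sketch that prints those &lt;limits.h&gt; values and checks the quoted relationship; which branch it takes depends on whether plain char is signed on the implementation you run it on (the %d formats assume these constants fit in an int, as they do when CHAR_BIT is 8):

```c
#include <limits.h>
#include <stdio.h>

int main(void)
{
    printf("CHAR_BIT  = %d\n", CHAR_BIT);
    printf("SCHAR_MIN = %d, SCHAR_MAX = %d\n", SCHAR_MIN, SCHAR_MAX);
    printf("UCHAR_MAX = %d\n", UCHAR_MAX);
    printf("CHAR_MIN  = %d, CHAR_MAX  = %d\n", CHAR_MIN, CHAR_MAX);

    /* Per 5.2.4.2.1: if plain char is signed, its limits match signed char;
       otherwise CHAR_MIN is 0 and CHAR_MAX matches unsigned char. */
    if (CHAR_MIN < 0)
        printf("plain char is signed here (limits match signed char)\n");
    else
        printf("plain char is unsigned here (limits match unsigned char)\n");
    return 0;
}
```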
From the comments on this answer:

[…] char (instead of wchar.) Even if the standard says that sizeof(char) must be 1, I wouldn't rely on that assumption. – Chip Uni

sizeof(char) is always 1, even if char is 32-bits (as happens on some systems). C has lots of fun warts. – Nick Bastin

sizeof(char) == sizeof(short) && sizeof(char) == sizeof(int) on those; I don't remember whether sizeof(int) == sizeof(long) or whether CHAR_BIT was 32 or 64; I expect it was 32, and I think sizeof(long) == 1 too. (You can find a reference to, but not online access to, a Cray C manual). – Jonathan Leffler