I have what is probably a very simple question about the IEEE-754 standard, which defines how numbers are encoded and stored on computers.
At university (in exams) I have come across the following definition of the 16-bit IEEE-754 format (half precision): 1 sign bit, 6 exponent bits, and 9 mantissa bits.
An internet search (and books) reveals a different definition: 1 sign bit, 5 exponent bits, and 10 mantissa bits.
I'm asking because I can hardly believe the university would make such a simple mistake, so: are there multiple definitions for numbers given in the 16-bit IEEE-754 format?
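For what it's worth, the 1/5/10 split can be checked directly: Python's `struct` module supports the IEEE-754 binary16 format via the `'e'` format character, so a small sketch (assuming CPython 3.6+) can print the raw bit fields of a half-precision value:

```python
import struct

def half_bits(x: float) -> str:
    """Encode x as IEEE-754 binary16 and return its bit fields."""
    # 'e' packs a half-precision float; reinterpret the 2 bytes as an integer
    (raw,) = struct.unpack('<H', struct.pack('<e', x))
    bits = f'{raw:016b}'
    sign, exponent, mantissa = bits[0], bits[1:6], bits[6:]
    return f'{sign} {exponent} {mantissa}'

print(half_bits(1.0))   # 0 01111 0000000000  (exponent field 15 = bias, so 2^0)
print(half_bits(-2.5))  # 1 10000 0100000000  (-1.25 * 2^1)
```

The exponent field of `1.0` comes out as `01111` (decimal 15), which matches the bias of 2^(5-1) - 1 = 15 that a 5-bit exponent implies; with a 6-bit exponent the bias would be 31 and the encoding would look different.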