The MySQL manual and several StackOverflow answers make it clear that varchar uses:
- 1 byte for varchars with 0-255 characters in them
- 2 bytes for varchars with more than 255 characters in them
The first part makes sense. A single byte can store 256 different values, i.e. 0 through 255.
What I am trying to figure out is how MySQL knows how many bytes indicate the length.
Imagine a 255-char varchar starting with the following bytes: [255][w][o][r][d]...
According to the manual, only the first byte indicates the length in this scenario. When reading the field, MySQL somehow has to know that this is the case here, and that the second byte ([w]) is part of the data, not of the length.
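To make the single-byte case concrete, here is a toy Python sketch of what I imagine the read looks like (the flat [length byte][data] layout is my simplification, not MySQL's actual row format):

```python
def decode_one_byte_prefix(buf: bytes) -> str:
    # The first byte is the length (0-255); the rest is the data.
    length = buf[0]
    return buf[1:1 + length].decode("ascii")

# [255][w][o][r][d]... -- the prefix byte 255 is the length,
# and the data happens to start with "word".
field = bytes([255]) + b"word" + b"x" * 251
print(len(decode_one_byte_prefix(field)))  # 255
```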
Now imagine a 256-char varchar starting with the following bytes: [255][1][w][o][r][d]... (I am guessing at how the two length bytes are actually encoded).
Now MySQL miraculously knows that it should interpret the first two bytes as the length when reading the field.
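Here is the same sketch for the two-byte case; I am assuming the two length bytes are little-endian, which is a guess on my part (so a length of 256 would actually be stored as [0][1]):

```python
def decode_two_byte_prefix(buf: bytes) -> str:
    # The first two bytes are the length, little-endian by assumption.
    length = int.from_bytes(buf[:2], "little")
    return buf[2:2 + length].decode("ascii")

# A 256-char value: 256 as a little-endian byte pair is [0][1].
field = (256).to_bytes(2, "little") + b"word" + b"x" * 252
print(len(decode_two_byte_prefix(field)))  # 256
```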
How does it distinguish between the two cases? The only foolproof way I have come up with is to interpret the first byte alone as the length, then check whether the text length matches (in the column's encoding); if it does not, the first two bytes must be the length.
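As a sketch, that guess-and-check idea would look something like this (entirely my own hypothetical logic, with the "does the text length match" test simplified to "does the declared length exactly fill the buffer"):

```python
def guess_decode(buf: bytes) -> str:
    # Try the one-byte interpretation first.
    length = buf[0]
    if 1 + length == len(buf):
        return buf[1:1 + length].decode("ascii")
    # Otherwise fall back to the two-byte (little-endian) interpretation.
    length = int.from_bytes(buf[:2], "little")
    if 2 + length == len(buf):
        return buf[2:2 + length].decode("ascii")
    raise ValueError("neither interpretation is consistent")

print(len(guess_decode(bytes([255]) + b"a" * 255)))                 # 255
print(len(guess_decode((256).to_bytes(2, "little") + b"a" * 256)))  # 256
```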