What I know about CRC (I have also written a Java implementation of it) is this:
Take an initial 16-bit message, for instance 0x0617:
0000.0110.0001.0111
which gets another 16 zero bits appended:
0000.0110.0001.0111|0000.0000.0000.0000
Then take the divisor, 0x1021:
0001.0000.0010.0001 (bits 0, 5, and 12 set, i.e. the CRC-16/CCITT polynomial x^16 + x^12 + x^5 + 1, with its leading x^16 bit left implicit)
We align it (together with its implicit leading 1, i.e. the 17-bit divisor 1.0001.0000.0010.0001) under the leftmost remaining 1 in our message, and XOR the bits, repeating until there are no more 1s in the first 16 bits. In total, there will be 5 XORs in our example.
The CRC will be the last 16 bits of our message, i.e. the remainder of the division. In this case, 1100.1000.0111.0000 (0xC870).
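Worked in hex, with the full 17-bit divisor 0x11021 shifted so its leading 1 sits under the leftmost remaining 1 at each step, the five XORs are:

```
  0x06170000
^ 0x04408400   (0x11021 << 10)
= 0x02578400
^ 0x02204200   (0x11021 << 9)
= 0x0077C600
^ 0x00440840   (0x11021 << 6)
= 0x0033CE40
^ 0x00220420   (0x11021 << 5)
= 0x0011CA60
^ 0x00110210   (0x11021 << 4)
= 0x0000C870
```

And here is a minimal bit-at-a-time sketch of that same division in Java, assuming the conventions implied above (polynomial 0x1021, initial CRC 0x0000, no input or output reflection, i.e. the XModem variant); the class and method names are just for illustration:

```java
public class Crc16Bitwise {
    static final int POLY = 0x1021; // x^16 + x^12 + x^5 + 1, leading x^16 bit implicit

    // Bit-at-a-time CRC-16: appending 16 zero bits and XOR-ing the divisor
    // under each leading 1 is equivalent to this shift-and-XOR loop.
    static int crc16(byte[] data) {
        int crc = 0;                             // initial remainder
        for (byte b : data) {
            crc ^= (b & 0xFF) << 8;              // feed the next byte into the top of the register
            for (int i = 0; i < 8; i++) {
                if ((crc & 0x8000) != 0)         // a leading 1: the divisor fits, XOR it in
                    crc = ((crc << 1) ^ POLY) & 0xFFFF;
                else                             // no leading 1: just shift
                    crc = (crc << 1) & 0xFFFF;
            }
        }
        return crc;
    }

    public static void main(String[] args) {
        // The message 0x0617 as two bytes; prints c870, the remainder above
        System.out.printf("%04x%n", crc16(new byte[]{0x06, 0x17}));
    }
}
```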
My question is: how can I implement this with a look-up table? Can someone show me a numerical example, with my polynomial, of how it is calculated?
0x06 is exclusive-ored with the CRC (which end depends on whether your polynomial is reflected or not), and then that byte is used to look up a 16-bit value in a table. The CRC is shifted by eight (direction dependent on reflection), and then exclusive-ored with the table value. That is then repeated for 0x17. – Mark Adler
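To make that concrete, here is a table-driven sketch in Java of the scheme described in that comment, again assuming the non-reflected, zero-initialized variant used in the question (the names are illustrative):

```java
public class Crc16Table {
    static final int POLY = 0x1021;
    static final int[] TABLE = new int[256];

    static {
        // TABLE[i] is the result of running the bit-at-a-time loop on the
        // single byte i, i.e. 8 shift-and-XOR steps starting from i << 8.
        for (int i = 0; i < 256; i++) {
            int crc = i << 8;
            for (int bit = 0; bit < 8; bit++)
                crc = ((crc & 0x8000) != 0) ? ((crc << 1) ^ POLY) & 0xFFFF
                                            : (crc << 1) & 0xFFFF;
            TABLE[i] = crc;
        }
    }

    // Byte-at-a-time update: XOR the message byte into the high end of the
    // CRC, use that byte as the table index, shift the CRC left by eight,
    // and XOR with the table entry, exactly the steps described above.
    static int crc16(byte[] data) {
        int crc = 0;
        for (byte b : data)
            crc = ((crc << 8) & 0xFFFF) ^ TABLE[((crc >> 8) ^ (b & 0xFF)) & 0xFF];
        return crc;
    }

    public static void main(String[] args) {
        System.out.printf("%04x%n", crc16(new byte[]{0x06, 0x17})); // prints c870
    }
}
```

Numerically, for the message 0x0617: byte 0x06 gives index (0x0000 >> 8) ^ 0x06 = 0x06 and TABLE[0x06] = 0x60C6, so the CRC becomes (0x0000 << 8) ^ 0x60C6 = 0x60C6; byte 0x17 gives index (0x60C6 >> 8) ^ 0x17 = 0x60 ^ 0x17 = 0x77 and TABLE[0x77] = 0x0E70, so the CRC becomes 0xC600 ^ 0x0E70 = 0xC870, the same remainder as the bit-by-bit division.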