I'm working with an ARM Cortex-M3 board that has a hardware CRC calculation unit. It supports three standard CRC polynomials. The interface to the module is very simple: I provide a pointer to the data and the data length. The problem I have is that it calculates LSB first, and I need the CRC MSB first (XMODEM CRC16-CCITT). Is there any way to take the value it calculates and transform it into MSB first?
3 Answers
EDIT/REWRITE:
From the information provided, I think you have these choices:
1) Arrange the data on the host (which is assumed to be less resource-constrained than the microcontroller) so that the microcontroller does not have to do as much work.
2) Make a byte-swapped copy of the data, using features/instructions where possible to make that faster, and then let the hardware CRC engine compute the CRC.
3) Don't use the hardware CRC engine; compute the CRC in software.
4) Ignore the CRC.
5) Use a different microcontroller that can handle this use case.
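Option 3 is only a few lines of code. As a minimal sketch (not the vendor's driver, just the standard bitwise algorithm): CRC-16/XMODEM uses polynomial 0x1021, initial value 0x0000, MSB-first shifting, and no reflection or final XOR:

```c
#include <stdint.h>
#include <stddef.h>

/* Bitwise CRC-16/XMODEM: poly 0x1021, init 0x0000,
 * MSB-first, no reflection, no final XOR. */
uint16_t crc16_xmodem(const uint8_t *data, size_t len)
{
    uint16_t crc = 0x0000;
    for (size_t i = 0; i < len; i++) {
        crc ^= (uint16_t)data[i] << 8;   /* feed next byte, MSB first */
        for (int bit = 0; bit < 8; bit++) {
            if (crc & 0x8000)
                crc = (uint16_t)((crc << 1) ^ 0x1021);
            else
                crc = (uint16_t)(crc << 1);
        }
    }
    return crc;
}
```

A quick sanity check: the conventional check string "123456789" gives 0x31C3 for CRC-16/XMODEM. A byte-at-a-time table-driven version is the usual speedup if the bitwise loop is too slow.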
[...] I need the CRC MSB first (XMODEM CRC16-CCITT). Is there any way to take the value it calculates and transform it into MSB first?
The XMODEM CRC16-CCITT operates on bytes as data, so most-significant-byte-first vs. least-significant-byte-first matters only for the representation of the CRC value. Just byte-swap the computed value, e.g. with the CMSIS __REV16() function.
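As a sketch (assuming the only mismatch really is the byte order of the 16-bit result): a plain-C swap, written portably here so it compiles anywhere; on a Cortex-M3 you could use the CMSIS `(uint16_t)__REV16(crc)` intrinsic instead, and a decent compiler will usually emit the REV16 instruction for this pattern anyway:

```c
#include <stdint.h>

/* Swap the two bytes of a 16-bit CRC value.
 * Equivalent to (uint16_t)__REV16(crc) from CMSIS on Cortex-M. */
static inline uint16_t crc16_byteswap(uint16_t crc)
{
    return (uint16_t)((crc << 8) | (crc >> 8));
}
```

So a CRC stored as 0x1234 becomes 0x3412; apply it once, just before transmitting or comparing the value.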
Edit:
I assumed both sides see 0x01 0x02 0x03 0x04 as such. If one side sees different bytes in memory, e.g. 0x04 0x03 0x02 0x01, then the CRC will fail. But you probably want to fix that sooner rather than later anyway, as it will give you severe headaches when processing the data.