I'm writing a data logging application (running on a microcontroller) that will write data to ordinary embedded NOR-type serial flash memory (in this case, an AT25DF161).
Each packet of data (240 or 496 bytes) will be logged individually to the flash, one record after another. I figure the most common failure in the flash memory would be a stuck bit, typically stuck at "0", the non-erased state. I need to be able to detect single-bit errors, with at most two per record (I assume this as a worst case after 100,000 write cycles).
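For context, this is roughly the record layout I have in mind, with a 16-bit CRC appended to each payload (the names and the struct itself are just placeholders, not final):

```c
#include <stdint.h>

#define PAYLOAD_BYTES 240u   /* or 496 for the larger packet type */

/* One log record as written to flash: payload followed by a CRC-16
 * computed over payload[]. */
typedef struct {
    uint8_t  payload[PAYLOAD_BYTES];
    uint16_t crc;
} log_record_t;
```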
I'm using a processor that has a built-in 16-bit CRC calculation module, so there's no performance penalty for using fewer or more terms - so what decisions would I need to make to choose an optimal polynomial?
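For what it's worth, this is the kind of bit-at-a-time reference routine I'd use to sanity-check candidate polynomials against the hardware module's output (the function and its parameters are my own sketch, not anything from the chip's or processor's API):

```c
#include <stdint.h>
#include <stddef.h>

/* Simple MSB-first (non-reflected) CRC-16 over 'len' bytes.
 * 'poly' is the polynomial in normal form, e.g. 0x1021 for CRC-16/CCITT,
 * and 'init' is the starting register value (e.g. 0xFFFF). */
static uint16_t crc16_bitwise(const uint8_t *data, size_t len,
                              uint16_t poly, uint16_t init)
{
    uint16_t crc = init;
    for (size_t i = 0; i < len; i++) {
        crc ^= (uint16_t)data[i] << 8;          /* bring in next byte */
        for (int b = 0; b < 8; b++) {
            crc = (crc & 0x8000u) ? (uint16_t)((crc << 1) ^ poly)
                                  : (uint16_t)(crc << 1);
        }
    }
    return crc;
}
```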