3 votes

I need to create a driver for a flash memory chip connected to an STM32 Cortex-M3 MCU. The chip is controlled via an SPI bus. I intended to use the integrated SPI peripheral of the MCU, but unfortunately it only supports 8- or 16-bit data packets, while the flash chip commands are 14 bits long. Thus, I have to implement the protocol from scratch using GPIOs. My question is: what is the right way to ensure correct signal timing? My current idea is to insert delays between asserting and deasserting the GPIO lines, with interrupts disabled, but that seems fairly unreliable to me. Are there any better methods?

3
I would boycott a memory with such an obscure interface. There are plenty of memories available using "standard SPI", as far as there is an SPI standard. – Lundin
@Lundin, unfortunately, the hardware was not designed by me, and apparently we will not be able to throw it away now. I agree that a chip with a more convenient interface would have been a better choice. – Roman Dmitrienko
Perhaps there are pin-compatible alternatives by other manufacturers? Even if re-designing the hardware isn't an option, you should at the very least ask for a rationale why that particular part was used. – Lundin
I guess this just wasn't thought out well. We are already looking for compatible chips :) – Roman Dmitrienko
Can you tell us what the flash device is? – timrorr

3 Answers

4 votes

Jeb's answer is the preferred method: use the hardware SPI if possible, and DMA as well if it is an option.

If you for some reason find out that you cannot use the hardware SPI, but must implement it by "bit-banging" over GPIO, you should check what options are available in the timer/PWM hardware on the MCU. You cannot and should not use blunt "hobbyist burn-away delays" as in the link you posted; the real-time performance will be crap and you will occupy the CPU 100%.

Most MCU timers come with a pin output feature that would allow a pin to change state when the timer elapses. The pseudo code would then be (a concrete sketch follows the list):

  • Determine if the next bit to send is 1 or 0.
  • Set the MCU polarity register accordingly, so that it will switch the pin to a high or low level.
  • When the timer elapses, you need to set the polarity once again, likely through an interrupt. How to do this is very hardware-dependent.
  • At the same time as you bit-bang the data (MOSI), you also need to generate the clock and chip select. The clock can be generated in the same way as the data, or possibly through a PWM signal if that option is available. Chip select is the easiest part as you only need to pull a pin low during the data transmission.
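
To make the idea concrete, here is a minimal sketch of a simpler variant: instead of the output-compare pin feature described above, a plain timer update interrupt drives SCK and MOSI as a half-bit state machine (SPI mode 0, 14-bit frame, MSB first). It assumes an STM32F1 CMSIS header; the use of TIM2 and the pin choices (PA4 = CS, PA5 = SCK, PA7 = MOSI) are placeholders, and the clock/GPIO/NVIC/timer setup (prescaler, ARR, TIM_DIER_UIE) is omitted:

    /* Sketch only: timer-paced software SPI, mode 0, 14-bit frames.
     * TIM2 and the pin assignments below are assumptions, not from
     * the question. Peripheral clock, GPIO mode, NVIC and timer
     * period setup are omitted for brevity. */
    #include "stm32f10x.h"

    #define CS_PIN   (1u << 4)            /* PA4, active low            */
    #define SCK_PIN  (1u << 5)            /* PA5                        */
    #define MOSI_PIN (1u << 7)            /* PA7                        */

    static volatile uint16_t tx_frame;    /* 14-bit command, left-aligned */
    static volatile uint8_t  half_bits;   /* remaining half-bit periods */

    void spi14_send(uint16_t cmd)
    {
        tx_frame  = cmd << 2;             /* align bit 13 with bit 15   */
        half_bits = 28;                   /* 14 bits x 2 half-periods   */
        GPIOA->BSRR = CS_PIN << 16;       /* assert CS                  */
        TIM2->CNT  = 0;
        TIM2->CR1 |= TIM_CR1_CEN;         /* ISR clocks the frame out   */
    }

    void TIM2_IRQHandler(void)
    {
        TIM2->SR = ~TIM_SR_UIF;           /* clear the update flag      */

        if (half_bits & 1u) {
            GPIOA->BSRR = SCK_PIN;        /* rising edge: slave samples */
        } else {
            GPIOA->BSRR = SCK_PIN << 16;  /* falling edge, then put     */
            GPIOA->BSRR = (tx_frame & 0x8000u)      /* next bit on MOSI */
                        ? MOSI_PIN : (MOSI_PIN << 16);
            tx_frame <<= 1;
        }

        if (--half_bits == 0) {
            TIM2->CR1 &= ~TIM_CR1_CEN;    /* stop the timer             */
            GPIOA->BSRR = SCK_PIN << 16;  /* idle the clock low         */
            GPIOA->BSRR = CS_PIN;         /* deassert CS                */
        }
    }

Each even half-period drives MOSI with the next data bit while SCK is low, and each odd half-period raises SCK, so the slave gets half a timer period of setup time before every sampling edge; the resulting SPI clock rate is half the timer update rate.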

Finally, there is most likely an application note or official example on how to write a software SPI for your particular MCU.

2 votes

I would recommend using the built-in SPI and DMA if possible!

You could remap your data into a byte array that holds a whole number of 14-bit frames. The least common multiple of 14 and 8 bits is 56 bits, so four 14-bit frames pack exactly into seven bytes, which you can then send with the standard SPI in 8-bit mode.
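
A sketch of the packing step could look like this (the function name and the MSB-first bit order are illustrative assumptions, not from the question):

    #include <stdint.h>

    /* Pack four 14-bit frames into seven bytes (56 bits, the least
     * common multiple of 14 and 8), MSB first. */
    void pack_frames(const uint16_t frames[4], uint8_t out[7])
    {
        uint64_t bits = 0;

        for (int i = 0; i < 4; i++)
            bits = (bits << 14) | (frames[i] & 0x3FFFu); /* append 14 bits */

        for (int i = 6; i >= 0; i--) {
            out[i] = (uint8_t)(bits & 0xFFu);            /* lowest byte last */
            bits >>= 8;
        }
    }

The seven packed bytes can then go straight into the SPI data register or a DMA buffer. Note that this only works if the flash chip accepts back-to-back frames while chip select stays asserted.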

Either way, this should be much faster with SPI/DMA than bit-banging the GPIOs.

0 votes

Some devices that use obscure data lengths are designed so that at the start of a transaction they will either ignore all "0" bits that are clocked in before the first "1", or all "1" bits that are clocked in before the first "0". If your device happens to be designed in such a fashion, you may be able to use 8- or 16-bit SPI mode by clocking out two "junk" bits along with the bits of interest.
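
If the device does behave that way (an assumption you must verify against the datasheet), the whole bit-banging problem disappears; a sketch, where spi_transmit16() is a placeholder for whatever 16-bit transmit routine you already have:

    #include <stdint.h>

    extern void spi_transmit16(uint16_t frame);  /* placeholder transmit call */

    /* Illustration only: two leading '0' junk bits make a 14-bit command
     * fit a 16-bit hardware SPI frame. For a device that instead ignores
     * leading '1' bits, OR in 0xC000u rather than masking. */
    void send_cmd14(uint16_t cmd14)
    {
        spi_transmit16(cmd14 & 0x3FFFu); /* bits 15..14 = 0, bits 13..0 = command */
    }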