
I'm using the DMA (described as PDC in the datasheet) of the SAM4SD16C with the USART0 peripheral. I've set up a timer that generates an interrupt every 1 ms. Every 5 ms a data transfer should be performed via DMA. Another interrupt should occur when the TXEMPTY flag is set.
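For reference, here is a minimal sketch of that setup (not my exact code), assuming the Atmel CMSIS device header for the SAM4S (names such as USART0, US_TPR, US_PTCR_TXTEN, US_IER_TXEMPTY) and SysTick as the 1 ms timer; the buffer name and length are placeholders, and USART/NVIC initialization is omitted:

    /* Sketch: kick off a PDC (DMA) transmit every 5 ms and get an
     * interrupt when the transmitter shift register is empty. */
    #include "sam4s.h"

    #define TX_LEN 16
    static uint8_t tx_buf[TX_LEN];
    static volatile uint32_t ms_ticks;

    void SysTick_Handler(void)              /* fires every 1 ms */
    {
        if (++ms_ticks >= 5) {              /* every 5 ms, start a PDC transfer */
            ms_ticks = 0;
            USART0->US_TPR  = (uint32_t)tx_buf;   /* transmit pointer           */
            USART0->US_TCR  = TX_LEN;             /* transmit counter           */
            USART0->US_PTCR = US_PTCR_TXTEN;      /* enable the PDC transmitter */
            USART0->US_IER  = US_IER_TXEMPTY;     /* interrupt when shifter is empty */
            /* toggle the scope pin here to mark the start of the transfer */
        }
    }

    void USART0_Handler(void)
    {
        if (USART0->US_CSR & US_CSR_TXEMPTY) {
            USART0->US_IDR = US_IDR_TXEMPTY;      /* disable until the next cycle */
            /* toggle the scope pin here to mark the end of the transfer */
        }
    }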

To see when the transmission starts and ends, I toggle an output pin and watch it on an oscilloscope. I then realized that the end of reception varies in time by about 20 µs (my main clock is 120 MHz), which is not acceptable in my project. Meanwhile, the start of transmission is precise to within 100 ns, so there is no problem on that point.

I'm wondering if there is a way to get better control over the DMA transfer timing.

Maybe interrupt or DMA channel priority? – A.R.C.
What do you mean by interrupt? An interrupt other than TXEMPTY? I also tried TXBUFE, which made sense to me, and the result is the same. Is it possible that my imprecision is due to the USART clock? I mean, I have a 56 kbaud baud rate, which results in a period of 17.857 µs. So my interrupt can't be more precise than that, am I wrong? – RPerun
Yes, the baud rate will have an influence, because the USART peripheral will wait until the last bit time has passed (or perhaps an additional bus idle time?) before the interrupt triggers. If you don't have any other interrupts or DMA channels activated, priority should not matter. – A.R.C.

1 Answer


As discussed in the comments above, the imprecision of the end-of-reception instant comes from the baud rate: at 56 kbaud one bit lasts 1/56000 ≈ 17.9 µs, which matches the roughly 20 µs of jitter observed. The variation is therefore on the order of one bit period, plus probably an additional bus idle time.