I am writing a low-level driver for a single-wire communication protocol. The line is connected to both the Tx pin and the Rx pin of an STM32F0 micro running from the internal clock at 8 MHz. The Tx pin state is set in a timer interrupt, and the Rx pin is read in an external GPIO (EXTI) interrupt.
For testing, I toggle the Tx pin every 416 µs (auto-reload value 3333 with no prescaler, so (3333 + 1) / 8 MHz ≈ 416.75 µs), and in the GPIO interrupt I measure the time between two consecutive interrupts. The measured times are roughly 500 µs from the high-to-low transition interrupt to the low-to-high transition interrupt, and 300 µs from the low-to-high transition interrupt to the high-to-low transition interrupt. Why is there such a difference, and how can I get rid of it?
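For reference, the timer setup is roughly as follows (a sketch reconstructed from the description above, not my exact init code; the function name MX_TIM2_Init is just the usual CubeMX-style name):

#include "stm32f0xx_hal.h"

TIM_HandleTypeDef htim2;

static void MX_TIM2_Init(void)
{
    __HAL_RCC_TIM2_CLK_ENABLE();

    htim2.Instance           = TIM2;
    htim2.Init.Prescaler     = 0;                       /* count at the full 8 MHz */
    htim2.Init.CounterMode   = TIM_COUNTERMODE_UP;
    htim2.Init.Period        = 3333;                    /* auto-reload: (3333 + 1) / 8 MHz ≈ 416.75 µs */
    htim2.Init.ClockDivision = TIM_CLOCKDIVISION_DIV1;
    HAL_TIM_Base_Init(&htim2);

    HAL_NVIC_SetPriority(TIM2_IRQn, 0, 0);
    HAL_NVIC_EnableIRQ(TIM2_IRQn);
    HAL_TIM_Base_Start_IT(&htim2);                      /* enable the update interrupt */
}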
I have checked the signal on a scope and it is a clean square wave with a pulse width of 416 µs. I have also wrapped different parts of the code between htim->Instance->CNT = 0; and time = htim->Instance->CNT; to find where the difference comes from, to no avail.
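The bracketing I mention looks like this (TIM3 free-runs at the same 8 MHz tick, so one count is 125 ns; time here is just a volatile uint32_t I inspect in the debugger):

extern TIM_HandleTypeDef htim3;
volatile uint32_t time;

htim3.Instance->CNT = 0;        /* reset the counter at the start of the section */
/* ... section of code under test ... */
time = htim3.Instance->CNT;     /* ticks elapsed inside the section */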
Here are the interrupt handlers; the measured time is saved in the tim3_value variable:
void TIM2_IRQHandler(void)
{
    if (__HAL_TIM_GET_FLAG(&htim2, TIM_FLAG_UPDATE) != RESET)
    {
        if (__HAL_TIM_GET_IT_SOURCE(&htim2, TIM_IT_UPDATE) != RESET)
        {
            __HAL_TIM_CLEAR_FLAG(&htim2, TIM_FLAG_UPDATE);
            HAL_GPIO_TogglePin(TX_GPIO_Port, TX_Pin);   /* drive the next Tx edge */
            htim2.Instance->ARR = 3333;                 /* keep the 416 µs half-period */
        }
    }
}
void EXTI4_15_IRQHandler(void)
{
    if (__HAL_GPIO_EXTI_GET_IT(RX_Pin) != 0x00u)
    {
        __HAL_GPIO_EXTI_CLEAR_IT(RX_Pin);
        tim3_value = htim3.Instance->CNT;   /* ticks since the previous edge (125 ns each) */
        htim3.Instance->CNT = 0;            /* restart timing for the next edge */
    }
}
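For completeness, the Rx pin interrupts on both edges. Its GPIO setup is roughly this (a sketch; RX_GPIO_Port, the GPIOB clock macro, and the helper name are my assumed CubeMX-style names for the board, not taken from the code above):

static void RxPin_EXTI_Init(void)
{
    GPIO_InitTypeDef GPIO_InitStruct = {0};

    __HAL_RCC_GPIOB_CLK_ENABLE();                        /* assuming the Rx pin is on GPIOB */

    GPIO_InitStruct.Pin  = RX_Pin;
    GPIO_InitStruct.Mode = GPIO_MODE_IT_RISING_FALLING;  /* interrupt on both edges */
    GPIO_InitStruct.Pull = GPIO_NOPULL;
    HAL_GPIO_Init(RX_GPIO_Port, &GPIO_InitStruct);

    HAL_NVIC_SetPriority(EXTI4_15_IRQn, 0, 0);
    HAL_NVIC_EnableIRQ(EXTI4_15_IRQn);
}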