The AVR in use is an ATmega2560.
I have an externally generated input signal with a pulse width of 1 second.
This signal is connected to the external interrupt pin INT0 on my AVR.
INT0 is being initialized as follows:
Code:
DDRD &= ~(1 << PD0);                   // PD0 (INT0) as input
PORTD |= (1 << PD0);                   // enable internal pull-up
EIMSK = (1 << INT0);                   // enable external interrupt 0
EICRA |= (1 << ISC00) | (1 << ISC01);  // trigger on rising edge
sei();                                 // enable global interrupts
The ISR for this external interrupt has two jobs: a) figure out which edge triggered it (the first one should be rising), and b) perform an action based on that edge.
The ISR looks something like this:
Code:
ISR(INT0_vect)
{
    if (EICRA == 0x02)          // ISC01 set, ISC00 clear: was armed for falling edge
    {
        // falling edge detected
        doFallingEdgeFunction_lightLED0();
        // quickly change the trigger to capture the opposite (rising) edge
        EICRA |= (1 << ISC00) | (1 << ISC01);
    }
    else if (EICRA == 0x03)     // ISC01 and ISC00 set: was armed for rising edge
    {
        // rising edge detected
        doRisingEdgeFunction_lightLED1();
        // change trigger to the falling edge
        EICRA = (1 << ISC01);
        EICRA &= ~(1 << ISC00);
    }
}
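In case it matters, the two edge handlers just drive a pair of LEDs and the rest of the program is essentially empty. It looks roughly like this (the LED pins PB6/PB7 are just placeholders; the exact pins aren't important):
Code:
#include <avr/io.h>
#include <avr/interrupt.h>

// placeholder LED pins
void doFallingEdgeFunction_lightLED0(void)
{
    PORTB |= (1 << PB6);    // LED0 on
    PORTB &= ~(1 << PB7);   // LED1 off
}

void doRisingEdgeFunction_lightLED1(void)
{
    PORTB |= (1 << PB7);    // LED1 on
    PORTB &= ~(1 << PB6);   // LED0 off
}

int main(void)
{
    DDRB |= (1 << PB6) | (1 << PB7);       // LED pins as outputs

    // INT0 setup from above
    DDRD &= ~(1 << PD0);
    PORTD |= (1 << PD0);
    EIMSK = (1 << INT0);
    EICRA |= (1 << ISC00) | (1 << ISC01);
    sei();

    while (1)
    {
        // everything happens in the ISR
    }
}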
Edge detection works and the correct LEDs light up, but for some reason changing the edge-select bits inside the ISR shrinks my input signal to about 0.1 second wide instead of the full 1 second.
On the scope I see the original signal mirrored, but with 10x shorter pulses! If I remove the trigger-switching lines, the signal is fine.
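Concretely, the version that leaves the signal alone is the same ISR with only the EICRA writes taken out (so the trigger stays on rising edges and only LED1 ever lights), roughly:
Code:
ISR(INT0_vect)
{
    if (EICRA == 0x02)
    {
        doFallingEdgeFunction_lightLED0();
        // EICRA |= (1 << ISC00) | (1 << ISC01);   <-- removed
    }
    else if (EICRA == 0x03)
    {
        doRisingEdgeFunction_lightLED1();
        // EICRA = (1 << ISC01);                   <-- removed
        // EICRA &= ~(1 << ISC00);                  <-- removed
    }
}
Any idea why writing to EICRA inside the ISR would affect the input signal itself?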