How can I find the starting point of array A and then calculate the average of A from that starting point over the following 1 second?
A=[0 0 0 0 0 -0.01 -0.2 0.3 0.4 0.5 0 0 0 0 0 0 0.01 0.02 0.03 0.04 0.1 0.2 0.3 0.4 0.7 0.8 1 1.2 1.3 1.4 1.5]
Time=[0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1 1.2 1.3 1.4 1.5 1.6 1.7 1.8 1.9 2 2.1 2.2 2.3 2.4 2.5 2.6 2.7 2.8 2.8 3 3.1]
After removing the noise, the starting point should be A(17), which equals 0.01.
Then I want to calculate the average of A over the 1-second interval that begins at that starting point.
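One possible sketch (not a definitive solution) is below. It assumes the baseline is exactly zero, that the early noise burst dies back to zero before the real signal starts, and that the signal stays nonzero once it has started, so the starting point is simply the first sample after the last zero. With A and Time defined as above:

startIdx = find(A == 0, 1, 'last') + 1;        % first sample after the last zero -> index 17
window   = Time >= Time(startIdx) & ...
           Time <= Time(startIdx) + 1;         % samples within 1 s of the starting point
avgA     = mean(A(window));                    % average of A over that 1-second window

For the data shown, startIdx comes out as 17 (Time = 1.7 s) and avgA is the mean of A over Time = 1.7 to 2.7 s, roughly 0.33. If the baseline is only approximately zero, the A == 0 test would need to be replaced by a tolerance check such as abs(A) < tol, with tol chosen to suit the noise level.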