I am having difficulty understanding the logic behind generating a plot of SNR (dB) vs. MSE. Different signal-to-noise ratios (SNR) are created by varying the noise power, and the MSE is averaged over T independent runs. For each SNR value, I generate NEval = 10 time series.
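If I have understood the averaging correctly, the quantity I want at each SNR value is the following (my own notation; $x_{t,e}$ is the clean signal in run $t$, signal $e$, and $\hat{x}_{t,e}$ is its estimate; averaging over the $N$ samples as well is my assumption):

$$\mathrm{MSE}(\mathrm{SNR}) \;=\; \frac{1}{T\,N_{\mathrm{Eval}}\,N}\sum_{t=1}^{T}\sum_{e=1}^{N_{\mathrm{Eval}}}\sum_{n=1}^{N}\bigl(x_{t,e}[n]-\hat{x}_{t,e}[n]\bigr)^{2}$$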
How do I correctly plot a graph of SNR vs. MSE when SNR is in the range 0:5:50? Below is the pseudo-code.
N = 100;                        % number of data points per time series
NEval = 10;                     % number of different signals per SNR
Snr = 0:5:50;                   % SNR values in dB
T = 1000;                       % number of independent runs
MSE = zeros(1, length(Snr));    % one MSE value per SNR
for t = 1:T
    for k = 1:length(Snr)
        for expt = 1:NEval
            % generate signal
            w0 = 0.001; phi = rand(1);
            signal = sin(2*pi*(1:N)*w0 + phi);
            % add zero-mean Gaussian noise at the current SNR
            noisy_signal = awgn(signal, Snr(k), 'measured');
            % call estimation algorithm
            % calculate error
        end
    end
end
plot(Snr, MSE); % where and how do I calculate this MSE?
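My current attempt is below; is this the right place for the accumulation? To keep the code runnable I use the noisy signal itself as a stand-in "estimate"; the actual estimation algorithm would replace that line.

N = 100; NEval = 10; Snr = 0:5:50; T = 1000;
MSE = zeros(1, length(Snr));               % one accumulator per SNR value
for t = 1:T
    for k = 1:length(Snr)
        for expt = 1:NEval
            w0 = 0.001; phi = rand(1);
            signal = sin(2*pi*(1:N)*w0 + phi);
            noisy_signal = awgn(signal, Snr(k), 'measured');
            estimate = noisy_signal;       % stand-in; real estimator goes here
            err = signal - estimate;       % per-sample estimation error
            MSE(k) = MSE(k) + sum(err.^2); % accumulate squared error per SNR
        end
    end
end
MSE = MSE / (T*NEval*N);                   % average over runs, signals, samples
plot(Snr, MSE); xlabel('SNR (dB)'); ylabel('MSE');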