I have just read a question on how to calculate the signal-to-noise ratio (SNR) here, and I was wondering how SNR relates to image quality. More specifically, I want to know at what SNR level we can tell that an image is of bad quality.
2 Answers
There are several image quality metrics; the one most closely related to SNR is PSNR (peak signal-to-noise ratio). You can read more about it here.
Of interest to your question:
"Typical values for the PSNR in lossy image and video compression are between 30 and 50 dB, where higher is better. Acceptable values for wireless transmission quality loss are considered to be about 20 dB to 25 dB."
In general though, PSNR is only an approximation to human perception of image quality.
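As a rough illustration, here is a minimal PSNR sketch in Python/NumPy. It assumes 8-bit images (peak value 255) and uses mean squared error, which is the standard definition; the function name `psnr` is my own, not a library API.

```python
import numpy as np

def psnr(reference, distorted, max_value=255.0):
    """Peak signal-to-noise ratio in dB between two same-sized images.

    Assumes 8-bit images by default (max_value = 255).
    """
    diff = reference.astype(np.float64) - distorted.astype(np.float64)
    mse = np.mean(diff ** 2)
    if mse == 0:
        return float("inf")  # identical images: PSNR is undefined/infinite
    return 10.0 * np.log10(max_value ** 2 / mse)
```

With this function, a heavily compressed image compared against its original would typically score toward the lower end of the 30-50 dB range quoted above.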
The signal-to-noise ratio (SNR) is used in imaging as a physical measure of the sensitivity of a (digital or film) imaging system. Industry standards measure SNR in decibels (dB) of power and therefore apply the 10 log rule to the "pure" SNR ratio (a ratio of 1:1 yields 0 dB, for instance), which in turn yields the "sensitivity." Industry standards measure and define sensitivity in terms of the ISO film speed equivalent:
- SNR: 32.04 dB = excellent image quality
- SNR: 20 dB = acceptable image quality
Find details at Wikipedia.
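A minimal sketch of the 10 log (power) conversion described above; the helper name `snr_db` and the example power ratios are illustrative, not part of any standard library.

```python
import math

def snr_db(signal_power, noise_power):
    """Convert a 'pure' power SNR ratio to decibels using the 10 log rule."""
    return 10.0 * math.log10(signal_power / noise_power)

print(snr_db(1.0, 1.0))     # 0.0 dB: a 1:1 ratio yields 0 dB, as noted above
print(snr_db(100.0, 1.0))   # 20.0 dB: the "acceptable" threshold above
print(snr_db(1600.0, 1.0))  # ~32.04 dB: the "excellent" threshold above
```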