2
votes

I want to estimate noise level of an input image.

I have made histograms of noisy and original images and compared them, and by looking at the histograms at two different stages of an image one can tell which one is noisy and what type of noise is present. (By noise, I mean common types of noise like Gaussian, Poisson, speckle and so on.)

I want to know if there is a way to detect the noise model and then estimate the noise level (based on the specific noise model, e.g. the standard deviation for Gaussian) from the image histogram, i.e. identify the density function. Or maybe this task needs the input in some form other than the spatial domain, i.e. the image needs to be transformed first and the task performed on the transform.

I am using an image with very low changes in pixel values, like a gradient, and then I apply the noise myself to compare the histograms of the noise-free and noisy images.

Edit: For clarity, I know you can detect noise by looking at the histogram. I am looking for a way that doesn't require me to do this "visually" myself. I want to detect the noise and possibly its density function, and after that do something depending on whether it is Gaussian or Poisson or... .
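A direction I'm considering (a rough sketch on synthetic data; the flat patch, the candidate distributions, and the sum-of-squared-error scoring are my own assumptions, not an established method): fit candidate noise models to the pixel values of a flat patch and pick the one whose fitted density best matches the empirical histogram:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical flat patch (constant value 100) corrupted by Gaussian noise
patch = 100 + rng.normal(0.0, 5.0, size=(64, 64))
samples = patch.ravel()

# Fit parameters of two candidate noise models
mu, sigma = samples.mean(), samples.std()                # Gaussian MLE
med = np.median(samples)
b = np.mean(np.abs(samples - med))                       # Laplace scale MLE

# Empirical density on a common grid of bin centers
hist, edges = np.histogram(samples, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

gauss_pdf = np.exp(-0.5 * ((centers - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
laplace_pdf = np.exp(-np.abs(centers - med) / b) / (2 * b)

# Sum of squared errors between empirical and model densities: smaller = better fit
scores = {
    "gaussian": float(np.sum((hist - gauss_pdf) ** 2)),
    "laplace": float(np.sum((hist - laplace_pdf) ** 2)),
}
best = min(scores, key=scores.get)
print(best, scores)
```

A proper goodness-of-fit test (e.g. Kolmogorov–Smirnov) would be more principled than raw SSE, but this illustrates the automated-selection idea.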

I would appreciate it if anyone could give any hints about the right path to solve this problem.

1
First you claim that “by looking at histogram in two different stage of an image one can tell which one is noisy and what type of noise is present”. Next you ask “I want to know if there is a way to detect noise model [...] from image histogram?” It seems to me that your first statement answers your question. - Cris Luengo
You could check for an even (uniform) distribution in your histogram; that could imply noise - T A
Cris, I am looking for a way to detect noise besides looking at the histogram visually myself - saeidiuum

1 Answer

3
votes

Generally speaking, it's impossible to determine the noise distribution by analyzing the histogram alone, because it is hard to tell whether the variations are due to image texture and lighting variation, or due to the noise. Here is a simple example histogram of an original and a noisy image (Gaussian noise) of beach sand:

comparison of histogram of noisy and unnoisy images

As the image variance is considerable and has a Gaussian distribution itself, the noise changes the histogram only slightly. Note that in the real world we do not have the original image for comparison.
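This masking effect can be reproduced numerically (a sketch on a purely synthetic "textured" signal; the pixel spread and noise level are illustrative assumptions, not measurements from the sand image):

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic "textured" image: pixel values already have a Gaussian spread (std 20)
texture = 128 + rng.normal(0.0, 20.0, size=(256, 256))
# Add independent Gaussian noise with std 5
noisy = texture + rng.normal(0.0, 5.0, size=texture.shape)

# Variances of independent sources add: 20^2 + 5^2 = 425 vs 400,
# a change of only a few percent in std, barely visible in a histogram
print(round(texture.std(), 1), round(noisy.std(), 1))
```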
In your case, as described, the original image is smooth, so the image variance is low. Any noise will increase the variance considerably, which is obvious in the histogram. So, for the basic part of the question,

I want to estimate noise level of an input image.

The simplest technique for estimating the noise of an image is to find the smoothest part of the image, compute the histogram of that part, and estimate the noise distribution of the whole image from it. Here is an example of noise estimation using OpenCV:

import cv2
import numpy as np
from matplotlib import pyplot as plt

img = cv2.imread('cameraman.bmp', 0)            # read as grayscale
row, col = img.shape
gauss = np.random.normal(10, 10, (row, col))    # Gaussian noise, mean 10, std 10
noisy = np.clip(img + gauss, 0, 255)            # keep values in the valid 8-bit range
smooth_part = noisy[:30, :30]                   # a flat region (sky) of the image

plt.subplot(221), plt.imshow(noisy, cmap='gray')
plt.title('Noisy Image'), plt.xticks([]), plt.yticks([])
plt.subplot(222), plt.imshow(smooth_part, cmap='gray')
plt.title('Smooth Part'), plt.xticks([]), plt.yticks([])
plt.subplot(223), plt.hist(noisy.ravel(), bins=256, range=[0, 256])
plt.title('Noisy Image Histogram'), plt.xticks([]), plt.yticks([])
plt.subplot(224), plt.hist(smooth_part.ravel(), bins=256, range=[0, 256])
plt.title('Estimated Noise Distribution'), plt.xticks([]), plt.yticks([])
plt.show()
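Beyond plotting, the noise standard deviation can also be read off numerically from the smooth patch. A self-contained sketch of the same idea on a synthetic flat region (the patch size and noise parameters mirror the example above, but the flat value 120 is an arbitrary assumption):

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulate a smooth (flat) image region plus Gaussian noise N(10, 10)
flat = np.full((30, 30), 120.0)
noisy_patch = flat + rng.normal(10.0, 10.0, size=flat.shape)

# In a flat region, pixel variation is almost entirely noise,
# so the sample std of the patch estimates the noise std
sigma_est = noisy_patch.std()
print(round(sigma_est, 2))  # close to 10
```

On a real image the same estimate is just `noisy[:30, :30].std()`, provided the chosen window really is flat in the clean image.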

and the result:

Estimated Noise