I am trying to implement Gaussian blur from scratch in C. My program loads a .bmp file into a struct, builds a Gaussian blur kernel (the first two for-loops below), then passes the struct (height, width, and the actual pixel data) to another function that applies the kernel to the image (the four nested for-loops below) and writes out the modified .bmp. It mostly works, but two things are wrong: I can't adjust the intensity of the blur (all levels of blur look the same), and the very outer edge of pixels all around the image never gets touched.
When I'm creating the filter, I use:
double sigma = 10.0;
double r, s = 2.0 * sigma * sigma;
double sum = 0.0;
But the output looks the same whether sigma is 1, 10, or 100; my understanding was that it should get blurrier as sigma grows.
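Doing the math by hand, I can at least see why 10 and 100 would match: with sigma = 10, s = 200, so even the corner of the 5x5 window (x*x + y*y = 8) keeps exp(-8/200) ≈ 0.96 of the center weight, and after normalization every entry comes out to roughly 1/25. But with sigma = 1 (s = 2) the corner weight falls to exp(-4) ≈ 0.018 of the center, so I'd expect that case to look visibly different, and it doesn't.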
Creating the filter:

// generate 5x5 kernel
for (int x = -2; x <= 2; x++)
{
    for (int y = -2; y <= 2; y++)
    {
        r = sqrt(x*x + y*y);
        gKernel[x + 2][y + 2] = exp(-(r*r)/s) / (M_PI * s);
        sum += gKernel[x + 2][y + 2];
    }
}
// normalize the kernel
for (int i = 0; i < 5; ++i)
    for (int j = 0; j < 5; ++j)
        gKernel[i][j] /= sum;
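In case it's useful for diagnosing, here is the kernel generation wrapped into a minimal standalone program (just the loops above plus the includes and a print), so the normalized weights can be inspected for different sigma values; compile with something like gcc kernel.c -lm:

#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846   // not guaranteed by strict ISO C
#endif

int main(void)
{
    double sigma = 10.0;              // try 1.0, 10.0, 100.0
    double r, s = 2.0 * sigma * sigma;
    double sum = 0.0;
    double gKernel[5][5];

    // generate 5x5 kernel (same loops as in the question)
    for (int x = -2; x <= 2; x++)
    {
        for (int y = -2; y <= 2; y++)
        {
            r = sqrt(x*x + y*y);
            gKernel[x + 2][y + 2] = exp(-(r*r)/s) / (M_PI * s);
            sum += gKernel[x + 2][y + 2];
        }
    }

    // normalize and print each weight
    for (int i = 0; i < 5; ++i)
    {
        for (int j = 0; j < 5; ++j)
            printf("%8.5f ", gKernel[i][j] / sum);
        printf("\n");
    }
    return 0;
}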
And then applying the filter:

for (i = 1; i < bmp->height - 1; i++)
{
    for (j = 1; j < bmp->width - 1; j++)
    {
        sum = 0.0;
        // accumulate the weighted 5x5 neighborhood
        for (p = 0; p < 5; p++)
        {
            for (q = 0; q < 5; q++)
            {
                sum += bmp->pixels[(i + p) * bmp->width + j + q] * gKernel[p][q];
            }
        }
        // write the result back into the same pixel buffer being read
        bmp->pixels[(i - 1) * bmp->width + j] = (unsigned char) sum;
    }
}
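One thing I noticed while writing this up: with a 5x5 kernel I would have expected the neighborhood offsets to be p - 2 and q - 2 (so the kernel is centered on pixel (i, j)), the result to be stored at (i, j) rather than (i - 1, j), and the output to go into a separate buffer, since the loop currently writes into the same pixels array it is still reading from, so later pixels see already-blurred input. Here is a sketch of what I mean (it needs <stdlib.h> and <string.h>, error checking omitted, and I'm not sure it's the right fix):

unsigned char *out = malloc((size_t) bmp->width * bmp->height);
memcpy(out, bmp->pixels, (size_t) bmp->width * bmp->height);   // keep the border pixels as-is

for (i = 2; i < bmp->height - 2; i++)
{
    for (j = 2; j < bmp->width - 2; j++)
    {
        sum = 0.0;
        for (p = 0; p < 5; p++)
            for (q = 0; q < 5; q++)
                sum += bmp->pixels[(i + p - 2) * bmp->width + (j + q - 2)] * gKernel[p][q];
        out[i * bmp->width + j] = (unsigned char) sum;   // write to the copy, not the source
    }
}

memcpy(bmp->pixels, out, (size_t) bmp->width * bmp->height);
free(out);

Even with that, a 2-pixel border is left untouched because the kernel would read outside the image there; I gather the usual fix is to clamp or mirror the neighborhood at the edges, which may be the same reason my version misses the outer edge?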
Any help much appreciated! Thank you for your time.