I've just started learning DirectCompute and, as a learning exercise, I want to make a simple convolution filter that blurs a texture. This is the compute shader I've written:
Texture2D<float4> inputTex;
RWTexture2D<float4> outputTex;
[numthreads(32, 32, 1)]
void main( uint3 DTid : SV_DispatchThreadID )
{
    float4 totalPixVal = float4( 0.0f, 0.0f, 0.0f, 0.0f );

    for ( int y = -3; y < 3; ++y )
    {
        for ( int x = -3; x < 3; ++x )
        {
            totalPixVal += inputTex.Load( uint3( DTid.x + x, DTid.y + y, DTid.z ) );
        }
    }

    totalPixVal = float4( totalPixVal.xyz / 49.0f, 1.0f );

    float4 inPix = inputTex.Load( DTid );
    outputTex[DTid.xy] = totalPixVal;
}
I dispatch it like this:
DeviceContext->Dispatch( inputTexture->Width / 32, inputTexture->Height / 32, 1 );
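In case the binding matters, the full sequence around that Dispatch looks roughly like this (simplified from my actual code; computeShader, inputSRV and outputUAV are just my names for the compute shader and the two views):

// Bind the compute shader, the input texture's SRV and the output texture's UAV.
DeviceContext->CSSetShader( computeShader, nullptr, 0 );
DeviceContext->CSSetShaderResources( 0, 1, &inputSRV );
DeviceContext->CSSetUnorderedAccessViews( 0, 1, &outputUAV, nullptr );

// One 32x32 thread group per 32x32 block of pixels.
DeviceContext->Dispatch( inputTexture->Width / 32, inputTexture->Height / 32, 1 );

// Unbind the views so the output texture can be used elsewhere afterwards.
ID3D11ShaderResourceView* nullSRV = nullptr;
ID3D11UnorderedAccessView* nullUAV = nullptr;
DeviceContext->CSSetShaderResources( 0, 1, &nullSRV );
DeviceContext->CSSetUnorderedAccessViews( 0, 1, &nullUAV, nullptr );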
The shader averages the color of the current pixel and its 48 neighbors (a 7x7 window), which produces the blur effect.
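In other words, the result I'm after is a 7x7 box filter:

$$\text{out}(x, y) = \frac{1}{49} \sum_{dy=-3}^{3} \sum_{dx=-3}^{3} \text{in}(x + dx,\; y + dy)$$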
The blurring part works fine, but the color the shader outputs doesn't seem to be correct. To test it I'm using a simple image with a pure white shape on a black background, so the output I expect is a blurred pure white shape on a black background. Instead, the white shape comes out grey. I've gone over my code several times and I can't find where the problem is. Both textures are 512x512 pixels.
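In case the resource setup is relevant, the output texture and the two views are created more or less like this (simplified, error checking removed; Device is my ID3D11Device, and inputTexture is my own wrapper around the loaded texture):

// Output texture: same size as the input, usable as a UAV by the compute shader.
D3D11_TEXTURE2D_DESC desc = {};
desc.Width = 512;
desc.Height = 512;
desc.MipLevels = 1;
desc.ArraySize = 1;
desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
desc.SampleDesc.Count = 1;
desc.Usage = D3D11_USAGE_DEFAULT;
desc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_UNORDERED_ACCESS;

ID3D11Texture2D* outputTexture = nullptr;
Device->CreateTexture2D( &desc, nullptr, &outputTexture );

// Views bound to the shader as inputTex / outputTex.
// inputTexture->Resource is the underlying ID3D11Texture2D in my wrapper.
ID3D11ShaderResourceView* inputSRV = nullptr;
ID3D11UnorderedAccessView* outputUAV = nullptr;
Device->CreateShaderResourceView( inputTexture->Resource, nullptr, &inputSRV );
Device->CreateUnorderedAccessView( outputTexture, nullptr, &outputUAV );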
This is my input image:
And this is the output I get:
Any help would be greatly appreciated.
Thanks :D