3 votes

I've got a strange problem with my deferred shading implementation. I render the required information via MRT into an FBO (currently diffuse, position, and normals in world space), which looks like this:

[Image: the G-buffer contents: diffuse, world-space position, and world-space normals]

This is done with the following setup for all three textures:

diffuse = std::shared_ptr<bb::Texture>(new bb::Texture(GL_TEXTURE_2D)); // generates texture ID
diffuse->bind();
diffuse->texture2D(0, GL_RGB, width, height, 0, GL_RGB, GL_FLOAT, 0); // glTexImage2D
diffuse->parameterf(GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
diffuse->parameterf(GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
diffuse->parameterf(GL_TEXTURE_MIN_FILTER, GL_LINEAR);
diffuse->parameterf(GL_TEXTURE_MAG_FILTER, GL_LINEAR);
diffuse->unbind();
texture2D(GL_COLOR_ATTACHMENT0+1, diffuse->getID(), 0); // attach to the FBO (glFramebufferTexture2D)
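
For reference, a minimal raw-OpenGL sketch of what these wrapper calls amount to (the mapping to GL calls is my assumption; the glDrawBuffers call is not shown in the question but is required once per FBO for MRT):

GLuint diffuseTex;
glGenTextures(1, &diffuseTex);
glBindTexture(GL_TEXTURE_2D, diffuseTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL_RGB, GL_FLOAT, nullptr);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glBindTexture(GL_TEXTURE_2D, 0);

// Attach to the currently bound FBO and declare the MRT draw buffers once.
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1, GL_TEXTURE_2D, diffuseTex, 0);
const GLenum bufs[] = { GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1, GL_COLOR_ATTACHMENT2 };
glDrawBuffers(3, bufs);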

These are then used in my drawing stage:

dsShader->bind();
dsShader->enableVertexAttribArrays();

ds->diffuse->bind(GL_TEXTURE0); // "ds" is an FBO containing the textures
ds->position->bind(GL_TEXTURE0+1);
ds->normal->bind(GL_TEXTURE0+2);

dsShader->sendUniform("diffuse", 0);
dsShader->sendUniform("position", 1);
dsShader->sendUniform("normal", 2);

dsShader->sendUniform("camera", camera3D->position.x, camera3D->position.y, camera3D->position.z);

dsOut->indexBuffer->bind();
dsOut->vertex2Buffer->bind();
dsOut->vertex2Buffer->vertexAttribPointer(dsShader->getAttribLocation("vertex0"), 2, GL_FLOAT, false, 0, 0);

glDrawElements(GL_TRIANGLES, dsOut->indexBuffer->size(), GL_UNSIGNED_INT, 0);

ds->diffuse->unbind();
ds->position->unbind();
ds->normal->unbind();

dsShader->disableVertexAttribArrays();
dsShader->unbind();

With the following fragment shader (just the relevant part; the light source is hardcoded):

struct DirLight{
    vec3 direction;
    vec4 diffuse, specular;
};

uniform sampler2D diffuse;
uniform sampler2D position;
uniform sampler2D normal;

uniform vec3 camera;

DirLight light0 = DirLight(vec3(1, 1, 0), vec4(0.3), vec4(0.1));

in vec2 vertex;

void main(){
    vec4 color = texture(diffuse, vertex)*0.5;
    vec3 p = vec3(texture(position, vertex));
    vec3 n = normalize(vec3(texture(normal, vertex)));

    float ndotl = max(dot(n, normalize(light0.direction)), 0.0); // diffuse (Lambertian) term

    if(ndotl > 0.0){
        color += ndotl*light0.diffuse;
    }

    gl_FragColor = color;
}
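
The vertex shader is not shown in the question; for a fullscreen quad it would be something like this sketch (assuming vertex0 holds NDC positions in [-1,1], matching the vertexAttribPointer call above):

in vec2 vertex0;
out vec2 vertex;

void main(){
    // Remap NDC [-1,1] to texture coordinates [0,1] for the G-buffer lookups.
    vertex = vertex0*0.5 + 0.5;
    gl_Position = vec4(vertex0, 0.0, 1.0);
}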

The weird part is: if I set the light source direction to negative values, let's say:

DirLight light0 = DirLight(vec3(-1, 0, 0), vec4(0.3), vec4(0.1));

The final result is not shaded. It looks right for positive values (more or less). Here is a picture of the output:

[Image: the final lit output]

There could also be a problem with the normals, marked in the red area.

It sounds like a problem with your normals. Have you tried rendering the scene without deferred shading? – Gurgadurgen
Yep, I tried that, no problem. You can see the output of all three textures in the first picture. Something I forgot to mention is that my z-axis is the up axis. – Kugel
It looks like your normals are reversed in the first one. I would suggest trying to fix that first, then coming back and testing the negative values again. – Gurgadurgen
What is the format of your normal G-buffer? In your attached screenshot, you have neglected to scale and bias the normals into the visible color range; that is to say, any normal with a negative value is simply clamped to black by GL when displayed on the default framebuffer. If you are trying to store the normals in a format like GL_RGB, then the G-buffer itself has this behavior too. You can either use something like GL_RGB8_SNORM, or do the old trick of manually scaling into [0,1] with * 0.5 + 0.5 and back to [-1,1] with * 2.0 - 1.0 (see the sketch after these comments). – Andon M. Coleman
@AndonM.Coleman Your solution seems to have worked for the OP. You should consider posting it as an answer. – Jason C
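
For clarity, here is a minimal GLSL sketch of the manual scale-and-bias trick Andon describes, if you keep an unsigned format like GL_RGB (normalOut is an illustrative output name, not from the question):

// Geometry pass: pack the world-space normal from [-1,1] into [0,1] before writing it.
normalOut = n*0.5 + 0.5;

// Lighting pass: unpack it back to [-1,1] after reading the G-buffer.
vec3 n = normalize(texture(normal, vertex).xyz*2.0 - 1.0);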

1 Answer

2 votes

OK, the solution by Andon M. Coleman works: I switched the internal format to GL_RGB8_SNORM. Thanks for your help :)
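
For anyone hitting the same issue, the change boils down to allocating the normal texture with a signed-normalized internal format, roughly like this (a sketch against the wrapper API from the question):

// GL_RGB8_SNORM stores each channel as a signed normalized byte in [-1,1],
// so negative normal components are preserved instead of being clamped to 0.
normal->texture2D(0, GL_RGB8_SNORM, width, height, 0, GL_RGB, GL_FLOAT, 0);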