
I'm trying to create a simple 3D game engine. I can load shaders and use them, but I want something more. I have a simple texturing shader and a lighting shader. They are separate shaders, and I want to make them work together. However, if I do the texturing with this shader:

//Vertex
#version 330 core
layout (location = 0) in vec3 Position;
layout (location = 1) in vec2 TexCoord;
uniform mat4 ModelViewMatrix;
out vec2 TexCoord0;
void main()
{
    gl_Position = ModelViewMatrix * vec4(Position, 1.0);
    TexCoord0 = TexCoord;
}
//Fragment
#version 330 core
in vec2 TexCoord0;
out vec4 FragColor;
uniform sampler2D texSampler;
void main() { FragColor = texture(texSampler, TexCoord0.st); }

I'm getting a textured cube mesh. I also want to light it with point and ambient lights. But if I use the texturing program, the cube mesh is not affected by the point lights. Conversely, if I use the point lighting shader for the cube mesh, I can't get a textured cube mesh.

I can't figure out what to do. How do I combine them?

1 Answer


The problem

This actually touches on a more complicated problem: "effect" design in game engines. Books have been written about the subject, so you will want more literature anyway. However...

Think about the shaders: each provides a part of the end result you want. In essence, though, the lighting shader uses the surface's albedo color to calculate the light reflected from it, and that albedo is the very value stored in your color texture.

So, the thing you have to do here is use both together. I'd guess adding the sampler to the lighting shader is the simpler route: copy the sampler declaration over, set up the uniform on the application side, and you're ready to substitute the sampled texel for the albedo in your light calculations; that means simply multiplying the calculated light color by the texture color.
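As a minimal sketch of that idea (the normal and position inputs and all the light uniforms below are my assumptions, not taken from your code), the combined fragment shader could look something like this:

#version 330 core
in vec2 TexCoord0;
in vec3 Normal0;    // assumed: surface normal passed from the vertex shader
in vec3 WorldPos0;  // assumed: world-space position passed from the vertex shader
out vec4 FragColor;

uniform sampler2D texSampler;
uniform vec3 ambientColor;  // hypothetical ambient light intensity
uniform vec3 lightPos;      // hypothetical point light position
uniform vec3 lightColor;    // hypothetical point light color

void main()
{
    // Sample the albedo from the texture, exactly as your texturing shader does.
    vec4 albedo = texture(texSampler, TexCoord0.st);

    // A minimal point light term: diffuse only, no attenuation or specular.
    vec3 n = normalize(Normal0);
    vec3 toLight = normalize(lightPos - WorldPos0);
    float diffuse = max(dot(n, toLight), 0.0);

    // Combine: the calculated light color multiplied by the texture color.
    vec3 light = ambientColor + lightColor * diffuse;
    FragColor = vec4(light, 1.0) * albedo;
}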

To make it work as part of a bigger system, though... that poses a harder problem, and it can be tackled with techniques such as semi-automatic shader parsing and building. You can also write different effects as separate shader objects and link them together into one program; that makes manual authoring easier, but it won't ultimately solve the problem of freely applying arbitrary effects to models.

Code code code

If you'd like a code example, here's a per-vertex lighting/texture/fog shader I wrote quite some time ago. It could probably be done better, but it worked well enough for me. You'll find the accompanying fragment shader in the same folder. As you can see, I refactored the light calculations into separate functions; those functions could live in different shader files that are compiled separately and then linked, as the sketch below shows.
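Here's a minimal sketch of that split (the function name and signature are made up for this example): one shader object defines the light function, the main fragment shader only declares it, and both objects are attached to the same program before linking.

// light.frag: compiled as its own fragment shader object
#version 330 core
vec3 pointLight(vec3 normal, vec3 toLight, vec3 lightColor)
{
    // Diffuse-only point light contribution.
    return lightColor * max(dot(normalize(normal), normalize(toLight)), 0.0);
}

// main.frag: only a prototype here; the definition is resolved at link time
#version 330 core
in vec3 Normal0;
in vec3 WorldPos0;
out vec4 FragColor;
uniform vec3 lightPos;
uniform vec3 lightColor;

vec3 pointLight(vec3 normal, vec3 toLight, vec3 lightColor);

void main()
{
    FragColor = vec4(pointLight(Normal0, lightPos - WorldPos0, lightColor), 1.0);
}

On the application side you'd compile both files as GL_FRAGMENT_SHADER objects, attach both to the same program, and link; exactly one of them may define main.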

To narrow it down to one line:

out_Color = mix(light * cubeTexture, skyColor, fogFactor);
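Here light is the accumulated ambient and point light contribution, cubeTexture is the sampled albedo, and mix blends the lit, textured color toward the sky color by fogFactor, so fully fogged fragments fade into the sky.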

Multiple geometry passes

For completeness, moved here from the comments: you can use two completely separate pipelines, as long as you're willing to pay the cost of processing each vertex once per pipeline. That's going to limit you in some ways, but in your example you could, in theory, do texturing and lighting in two separate draw calls and only then blend the colors of the two separately rendered images.
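A minimal sketch of that final blend pass, assuming you've rendered the textured image and the lighting into two offscreen textures and are now drawing a fullscreen quad (all names here are hypothetical):

#version 330 core
in vec2 TexCoord0;
out vec4 FragColor;

uniform sampler2D texturedImage;  // hypothetical: result of the texturing-only pass
uniform sampler2D lightingImage;  // hypothetical: result of the lighting-only pass

void main()
{
    // Modulate the albedo image by the lighting image.
    FragColor = texture(texturedImage, TexCoord0) * texture(lightingImage, TexCoord0);
}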