I am trying to mimic the behavior of OpenGL glTexEnv with a shader. It is quite a complex function, but it should be doable. The only problem is that the function works differently depending on the texture's base internal format. How can I get that information out of a texture? The base internal format is specified in the glTexImage2D call, so do I have to save it to a variable there and pass it to the shader as a uniform depending on the bound texture, or can I query it somehow from OpenGL?
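Roughly what I have in mind for the tracking variant (just a sketch; the uniform name u_baseFormat and the helper names are placeholders):

    // Sketch: remember the base internal format per texture at upload time
    // and hand it to the shader as a uniform before drawing.
    #include <GL/glew.h>   // or whatever GL 2.0+ loader is already in use
    #include <map>

    static std::map<GLuint, GLint> g_baseFormat;   // texture id -> base internal format

    void texImage2DTracked(GLuint tex, GLint baseFormat, GLsizei w, GLsizei h,
                           GLenum format, GLenum type, const void* pixels)
    {
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexImage2D(GL_TEXTURE_2D, 0, baseFormat, w, h, 0, format, type, pixels);
        g_baseFormat[tex] = baseFormat;            // e.g. GL_RGB, GL_RGBA, GL_LUMINANCE
    }

    void bindTextureForShader(GLuint program, GLuint tex)
    {
        glUseProgram(program);
        glBindTexture(GL_TEXTURE_2D, tex);
        glUniform1i(glGetUniformLocation(program, "u_baseFormat"), g_baseFormat[tex]);
    }

In the fragment shader I would then branch on u_baseFormat to pick the right glTexEnv formula for that format, but this bookkeeping is exactly the part I was hoping OpenGL could do for me.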
0
votes
How does the internal format affect the shader?
– Luca
@Luca You might have a texture that has, for example, only one channel (let's say only a Red value, encoded in 32 bits). Now if your shader expects at least two channels (say, Red and Green), you might run into problems when sampling that one-channel texture.
– wip
2 Answers
0
votes
It seems that the texture environment (blending) values are not among the built-in uniforms:
http://www.opengl.org/sdk/libs/OpenSceneGraph/glsl_quickref.pdf
Generally, there are two ways to mimic the FFP behavior:
- One big shader, passing the needed values (texture environment mode, base internal format, ...) to it as uniforms.
- Creating shaders dynamically as needed, one per behavior and internal format combination, for example.
I prefer the second way, because it does not result in one big, slow all-round shader. Instead, you get small, specific shaders (a sketch follows below).
Additionally: this common problem is known as the "Combinatorial Shader Explosion" problem.
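A rough sketch of the second way, assuming you cache one generated program per (texture environment mode, base internal format) combination; buildFragmentSource and compileProgram are placeholder helpers, and only GL_MODULATE/GL_REPLACE are shown, not the full glTexEnv table:

    // Sketch: generate and cache a small, specific shader per state combination.
    #include <GL/glew.h>   // or whatever GL loader is already in use
    #include <map>
    #include <string>
    #include <utility>

    GLuint compileProgram(const std::string& fragmentSource);  // placeholder: compile + link elsewhere

    static std::map<std::pair<GLenum, GLint>, GLuint> g_programCache;

    std::string buildFragmentSource(GLenum envMode, GLint baseFormat)
    {
        // Emit only the code path this combination needs instead of branching
        // on uniforms at runtime.
        std::string src = "uniform sampler2D tex;\n"
                          "void main() {\n"
                          "    vec4 texel = texture2D(tex, gl_TexCoord[0].st);\n";
        if (envMode == GL_MODULATE && baseFormat == GL_RGB)
            // GL_RGB has no alpha, so per the glTexEnv tables the fragment's
            // alpha passes through unmodified.
            src += "    gl_FragColor = vec4(gl_Color.rgb * texel.rgb, gl_Color.a);\n";
        else if (envMode == GL_MODULATE)   // e.g. GL_RGBA
            src += "    gl_FragColor = gl_Color * texel;\n";
        else
            // GL_REPLACE shown here; GL_DECAL, GL_BLEND, GL_ADD, GL_COMBINE and
            // the remaining base formats would each get their own branch.
            src += "    gl_FragColor = texel;\n";
        src += "}\n";
        return src;
    }

    GLuint programFor(GLenum envMode, GLint baseFormat)
    {
        std::pair<GLenum, GLint> key(envMode, baseFormat);
        auto it = g_programCache.find(key);
        if (it != g_programCache.end())
            return it->second;              // reuse, no re-compilation
        GLuint prog = compileProgram(buildFragmentSource(envMode, baseFormat));
        g_programCache[key] = prog;
        return prog;
    }

The cache is what keeps the combinatorial explosion manageable in practice: only the combinations your application actually uses ever get built.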