2
votes

I'm having an issue where my GLSL 1.30 code won't run properly on my somewhat modern (ATI 5850) hardware, while the identical code runs perfectly fine on an older laptop with an NVIDIA card. This is the case no matter what OpenGL context I use. What happens is that the vertex attributes in_position, in_colour and in_normal don't bind properly on the new hardware. It appears I am forced to a newer version of GLSL (330) on newer hardware.

Here is the GLSL code for the vertex shader. It's fairly simple and basic.

#version 130

uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat4 modelMatrix;
uniform mat4 normalMatrix;

in vec4 in_position;
in vec4 in_colour;
in vec3 in_normal;

out vec4 pass_colour;

smooth out vec3 vNormal;

void main()
{
   gl_Position = projectionMatrix * viewMatrix * modelMatrix * in_position;
   vec4 vRes = normalMatrix * vec4(in_normal, 0.0);
   vNormal = vRes.xyz;
   pass_colour = in_colour;
}

What happens is data for:

in vec4 in_position;
in vec4 in_colour;
in vec3 in_normal;

doesn't bind, or binds only partially — the values come through oddly distorted. From my testing everything else works properly. Changing the version to 330 and using the layout(location) qualifier fixes the issue, but that also makes the code incompatible with older versions of OpenGL...

Here is a sample of the code I use to specify these locations.

for the program:

glBindAttribLocation(LD_standard_program, 0, "in_position");
glBindAttribLocation(LD_standard_program, 1, "in_colour");
glBindAttribLocation(LD_standard_program, 2, "in_normal");

and later for the data itself:

--- code to buffer vertex data
glEnableVertexAttribArray(0);
glVertexAttribPointer((GLuint) 0, 4, GL_FLOAT, GL_FALSE, 0, 0);
--- code to buffer colour data
glEnableVertexAttribArray(1);
glVertexAttribPointer((GLuint) 1, 4, GL_FLOAT, GL_FALSE, 0, 0);
--- code to buffer normal data
glEnableVertexAttribArray(2);
glVertexAttribPointer((GLuint) 2, 3, GL_FLOAT, GL_FALSE, 0, 0);

My question is: isn't OpenGL supposed to be backwards compatible? I'm starting to be afraid that I'll have to write separate shaders for every single version of OpenGL to make my program run on different hardware... Since binding these attributes is very basic functionality, I doubt it's a bug in the ATI implementation...

1
It's more likely that your code encountered either an NVIDIA bug that allowed your non-conformant code to work or an ATI bug that caused your conforming code to fail. Or both. Anyway, where did you put the code for binding your vertex attributes? - Nicol Bolas
Can you elaborate on 'doesn't bind or not fully'? What does that mean? Any glGetErrors, or shader link errors? - Tim

1 Answer

1
votes

Are you calling glBindAttribLocation before glLinkProgram? Calling it afterwards has no effect, because the vertex attribute indices are only assigned during glLinkProgram. If the bindings never took effect, the driver is free to assign its own indices, which may happen to match your hard-coded 0/1/2 on one vendor's implementation and not on another's — which would explain the NVIDIA/ATI difference.
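A minimal sketch of the required ordering (using the program handle name from your question; this assumes LD_standard_program is a valid program object with its shaders already attached, and needs a live GL context to run):

```c
/* Bind attribute indices BEFORE linking -- the bindings only
 * take effect at link time. */
glBindAttribLocation(LD_standard_program, 0, "in_position");
glBindAttribLocation(LD_standard_program, 1, "in_colour");
glBindAttribLocation(LD_standard_program, 2, "in_normal");

glLinkProgram(LD_standard_program);

/* Optionally verify what the linker actually assigned;
 * -1 means the attribute is inactive or the name was not found. */
GLint loc = glGetAttribLocation(LD_standard_program, "in_position");
```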

In GLSL 3.30+ there is a better way of specifying attribute indices, directly in the GLSL code:

layout(location=0) in vec4 in_position;
layout(location=1) in vec4 in_colour;
layout(location=2) in vec3 in_normal;  
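If you need to stay compatible with GLSL 1.30, a portable alternative is to skip fixed indices entirely and query whatever indices the linker assigned. A sketch (program handle name taken from your question; requires a GL context):

```c
/* After glLinkProgram, ask the driver where each attribute ended up. */
GLint posLoc = glGetAttribLocation(LD_standard_program, "in_position");
GLint colLoc = glGetAttribLocation(LD_standard_program, "in_colour");
GLint nrmLoc = glGetAttribLocation(LD_standard_program, "in_normal");

/* Use the queried indices instead of hard-coded 0/1/2
 * when setting up the vertex arrays. */
glEnableVertexAttribArray((GLuint) posLoc);
glVertexAttribPointer((GLuint) posLoc, 4, GL_FLOAT, GL_FALSE, 0, 0);
```

This works on any GL 2.0+ implementation regardless of which indices the driver picks.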

Edit: oh, I missed the part where you said you'd already tried the layout keyword.