2 votes

I've been trying to get texturing working under OpenGL 3.1 on an Intel HD Graphics 2000/3000 card running Ubuntu 13.04. The issue I'm running into is that textures either don't load, so the basic triangle I'm trying to texture comes up black, or only some color from the texture gets loaded but not the entire image. I get the same outcome whether I use a raw image file as the source or load a JPEG with libjpeg.

My shaders are as follows:

Vertex shader

#version 130

in vec3 vert;
in vec2 vertTextCoord;

out vec2 fragTexCoord;

void main(){
    fragTexCoord = vertTextCoord;
    gl_Position = vec4(vert,1);

}

Fragment shader

#version 130

uniform sampler2D tex;

in vec2 fragTexCoord;

out vec4 finalColor;

void main() {

    finalColor = texture(tex, fragTexCoord);
}

code for creating the texture

glGenTextures( 1, &texture);
glBindTexture( GL_TEXTURE_2D, texture);

imgdata image_data = loadRaw("texture.bmp", 256, 256);

glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,GL_LINEAR );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER,GL_LINEAR );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S,GL_CLAMP_TO_EDGE);
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T,GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 256, 0, GL_RGB, GL_UNSIGNED_BYTE,image_data.data);

and the render function

void display(void){
    glClearColor( 1.f, 0.f, 0.f, 0.f);
    glClear(GL_COLOR_BUFFER_BIT );

    //load program to use
    glUseProgram(shaderprogram);

    GLint uniform = glGetUniformLocation(shaderprogram, "tex");
    if(uniform == -1){
        throw std::runtime_error(std::string("program uniform not found: tex"));
    }

    // bind the texture
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, texture);
    glUniform1i(uniform, 0);

    //load vertex array to use
    glBindVertexArray(cubeVAO);

    //draw triangle
    glDrawArrays(GL_TRIANGLES, 0, 3);

    //unbind for next pass
    glBindVertexArray(0);
    glUseProgram(0);

    glfwSwapBuffers();
}

the geometry and texture coordinates

GLfloat data[] = {
//X   Y    Z     U    V
0.0f, 0.8f, 0.0f,      0.5f, 1.0f,
-0.8f, -0.8f, 0.0f,    0.0f, 0.0f,
0.8f, -0.8f, 0.0f,     1.0f, 0.0f
};

VBO and VAO being setup

glGenVertexArrays(1, &cubeVAO);
glBindVertexArray(cubeVAO);
glGenBuffers(1, &cubeVBO);
glBindBuffer(GL_ARRAY_BUFFER, cubeVBO);

glBufferData(GL_ARRAY_BUFFER, sizeof(data), data, GL_STATIC_DRAW);


glEnableVertexAttribArray(vertprog);
glVertexAttribPointer(vertprog, 3, GL_FLOAT, GL_FALSE, 5*sizeof(GLfloat), NULL);

glEnableVertexAttribArray(verttexprog);
glVertexAttribPointer(verttexprog, 2, GL_FLOAT, GL_TRUE, 5*sizeof(GLfloat), (const GLvoid*)(3*sizeof(GLfloat)));

glBindVertexArray(0);
How are you creating cubeVAO? – defube
"some color from the texture will get loaded" leads me to think the texture coordinates are ill-defined, so sometimes you'll see just part of the texture (perhaps a single texel) stretched over the whole triangle. – GuyRT
Also worth considering: what OpenGL extensions are supported by the Intel chipset? I like GLView for a quick and easy way to discover what level of compliance is supported. – GMasucci
Does it always produce the same wrong result every time you run it, or does it produce different results each time, even if you didn't change any code? – Aaron
I think texture2D is deprecated and replaced by texture in newer OpenGL versions; maybe you could try changing that? – Aaron

1 Answer

0 votes

You have not shown the code where you determine the value of verttexprog. From the code you have posted, I have to assume that verttexprog (this is a terrible variable name, by the way) is uninitialized.

You should initialize verttexprog to glGetAttribLocation (program, "vertTextCoord"); after you link your program. Likewise, do not query uniform locations every frame; the only time they change is when you (re-)link a GLSL program.
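
A sketch of that one-time setup, done right after linking (names taken from the question; this needs a live GL context, so it is illustrative only):

```cpp
// After glLinkProgram(shaderprogram): query locations once and cache them.
GLint vertprog    = glGetAttribLocation(shaderprogram, "vert");
GLint verttexprog = glGetAttribLocation(shaderprogram, "vertTextCoord");
GLint texUniform  = glGetUniformLocation(shaderprogram, "tex");
if (vertprog == -1 || verttexprog == -1 || texUniform == -1)
    throw std::runtime_error("attribute/uniform not found (misspelled or optimized out)");
// Use the cached texUniform in display() instead of querying every frame.
```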


As a quick sanity check that the texture data itself is uploaded correctly, try temporarily replacing:

finalColor = texture(tex, fragTexCoord);

with:

finalColor = texture(tex, gl_FragCoord.st);

This is not the behavior you want, but it is a great way to show that your texture is loaded fine.