
I'm writing a program to draw simple blocks on screen using a shader that gets data from a vertex array. The vertex array is an array of floats that interleaves x,y,z position values with u,v texture coordinates in the following order:

[ x1, y1, z1, u1, v1, x2, y2, z2, u2, v2, ...]

I created the buffer like so (data is a vector object that I filled with data earlier):

glGenBuffers(1, &data_buffer);
glBindBuffer(GL_ARRAY_BUFFER, data_buffer);
glBufferData(GL_ARRAY_BUFFER, data.size() * sizeof(float), &(data[0]), GL_STATIC_DRAW);
glBindBuffer(GL_ARRAY_BUFFER, 0);
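(As a side note: one way to avoid hand-computing strides and offsets for an interleaved layout like this is to describe a vertex as a struct and derive them with sizeof/offsetof. The Vertex type below is my own illustration, not from my actual code; it just assumes the 3-floats-then-2-floats layout described above.)

```cpp
#include <cstddef>  // offsetof, std::size_t

// Hypothetical struct matching the interleaved layout above:
// three position floats followed by two texture-coordinate floats.
struct Vertex {
    float x, y, z;  // position
    float u, v;     // texture coordinates
};
// Five tightly packed floats; no padding expected for this layout.
static_assert(sizeof(Vertex) == 5 * sizeof(float), "unexpected padding");

// Stride and attribute offsets derived instead of hard-coded:
constexpr std::size_t stride = sizeof(Vertex);        // 5 * sizeof(float)
constexpr std::size_t posOff = offsetof(Vertex, x);   // 0
constexpr std::size_t uvOff  = offsetof(Vertex, u);   // 3 * sizeof(float)
```

These would then feed the attribute setup, e.g. glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, stride, (void*)uvOff), so both attributes always agree on the stride.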

And later I draw it like so:

glUseProgram(city_shader);
glEnableVertexAttribArray(0);
glEnableVertexAttribArray(1);
glActiveTexture(GL_TEXTURE0);

glUniformMatrix4fv(mvpMatrixID, 1, GL_FALSE, &mvp_matrix[0][0]);
glBindTexture(GL_TEXTURE_2D, texture.tex);
glUniform1i(textureID, 0);
glBindBuffer(GL_ARRAY_BUFFER, data_buffer);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 5 * sizeof(GL_FLOAT), (void*)0);
glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 5 * sizeof(GL_FLOAT), (void*)(3 * sizeof(GL_FLOAT)));

glDrawArrays(GL_QUADS, 0, data.size()/5);

glBindTexture(GL_TEXTURE_2D, 0);
glUseProgram(0);
glDisableVertexAttribArray(0);
glDisableVertexAttribArray(1);

But this just yields a blank screen:

http://imgur.com/IrinBKc

At first, I thought something was wrong with my shader. However, I noticed that when I mangled the stride passed to glVertexAttribPointer like so:

glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 4 * sizeof(GL_FLOAT), (void*)0); //Stride is now only 4, when there are 5 elements per vertex
glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 5 * sizeof(GL_FLOAT), (void*)(3 * sizeof(GL_FLOAT)));

I get the following (mangled) image:

http://imgur.com/EXJaJWi

This leads me to believe that my shaders and other drawing calls are in fact working correctly (that is, they are applying the textures and transforming the vertices), but only when the data fed to them is wrong in some way. I can only get something drawn when I mangle the stride of my buffer's vertex attribute. Any suggestions as to why?

EDIT: Also, I've inserted calls to glGetError(), to no avail: no errors are reported.
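(For reference, glGetError() can queue several errors, so it's worth draining in a loop rather than calling it once. The decoder below is only a sketch; the hex values are the standard GL error constants, written out literally so the snippet compiles without GL headers.)

```cpp
#include <string>

// Standard OpenGL error codes (values from the GL spec), written as
// plain hex so this sketch compiles without including GL headers.
std::string glErrorToString(unsigned int err) {
    switch (err) {
        case 0:      return "GL_NO_ERROR";
        case 0x0500: return "GL_INVALID_ENUM";
        case 0x0501: return "GL_INVALID_VALUE";
        case 0x0502: return "GL_INVALID_OPERATION";
        case 0x0505: return "GL_OUT_OF_MEMORY";
        case 0x0506: return "GL_INVALID_FRAMEBUFFER_OPERATION";
        default:     return "unknown GL error";
    }
}

// In a real program you would drain the whole error queue after a call:
//   for (GLenum e; (e = glGetError()) != GL_NO_ERROR; )
//       std::cerr << glErrorToString(e) << '\n';
```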

EDIT2: Per Peppe's request, here is the code for my shaders:

Vertex Shader:

#version 430 core
layout(location = 0) in vec3 pos;
layout(location = 1) in vec2 uv_coords;
uniform mat4 mvpMatrix;
out vec2 vs_uv;
void main()
{
    vs_uv = uv_coords;
    gl_Position = mvpMatrix * vec4(pos, 1.0);
}

Fragment Shader:

#version 430 core
in vec2 vs_uv;
uniform sampler2D tex;
out vec4 fs_color;
void main()
{
    fs_color = vec4(texture(tex, vs_uv));
}

And a subset of the code that populates my buffer:

//West wall - x-axis increases
    float base_x_texture_coord = gen_random_x_texture_coord();
    float base_y_texture_coord = gen_random_y_texture_coord();
    data.push_back(segment_pos_x + half_x_width); data.push_back(0.0f); data.push_back(segment_pos_z + half_z_width);
    data.push_back(base_x_texture_coord); data.push_back(base_y_texture_coord);
    data.push_back(segment_pos_x + half_x_width); data.push_back(height); data.push_back(segment_pos_z + half_z_width);
    data.push_back(base_x_texture_coord); data.push_back(base_y_texture_coord + height);
    data.push_back(segment_pos_x + half_x_width); data.push_back(height); data.push_back(segment_pos_z - half_z_width);
    data.push_back(base_x_texture_coord + z_width); data.push_back(base_y_texture_coord + height);
    data.push_back(segment_pos_x + half_x_width); data.push_back(0.0f); data.push_back(segment_pos_z - half_z_width);
    data.push_back(base_x_texture_coord + z_width); data.push_back(base_y_texture_coord);
Did you make sure that attribute 0 is associated with your vertex shader input for the position, and attribute 1 with the input for the texture coordinates? You would use calls like glGetAttribLocation() or glBindAttribLocation(), or layout (location=..) directives in the shader code. - Reto Koradi
On most platforms it should not be an issue, but you really, really really want to write sizeof(GLfloat), not sizeof(GL_FLOAT). (Not an issue as both sizes will be 4 bytes, for both a single-precision float as well as an integer -- the expansion of the GL_FLOAT macro). - peppe
Also, please stop drawing using quads. That's no longer supported in the Core profile. - peppe
RetoKoradi: I've confirmed that the attributes are set up correctly; in the shader they are defined as layout(location = 0) in vec3 pos; layout(location = 1) in vec2 uv_coords;. peppe: Thanks for the tips. I've modified my code to use GLfloat instead of GL_FLOAT, but that didn't fix it. I'll convert my renderer to use triangles in the future, but I'd like to get this figured out first. - redsoxfantom
I am not sure that sizeof (GL_FLOAT) necessarily has to be 32-bit. GLenum is a 32-bit type, but the constant values that OpenGL uses are all limited to the lower 16-bits. Often an integer literal will be 32-bit but it does not have to be. No matter what the size of the constant is, you do not want to use that :P - Andon M. Coleman
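(Expanding on peppe's GL_QUADS comment: in the Core profile, each quad with corners 0,1,2,3 in winding order can instead be drawn as two triangles, (0,1,2) and (0,2,3). The helper below is a sketch of the index conversion, my own illustration rather than code from the question, for use with glDrawElements(GL_TRIANGLES, ...).)

```cpp
#include <cstdint>
#include <vector>

// For every group of 4 consecutive vertices forming a quad
// (corners in winding order 0,1,2,3), emit indices for two
// triangles, (0,1,2) and (0,2,3), preserving the winding.
std::vector<std::uint32_t> quadsToTriangleIndices(std::size_t quadCount) {
    std::vector<std::uint32_t> idx;
    idx.reserve(quadCount * 6);
    for (std::uint32_t q = 0; q < quadCount; ++q) {
        std::uint32_t base = q * 4;
        idx.insert(idx.end(), { base, base + 1, base + 2,
                                base, base + 2, base + 3 });
    }
    return idx;
}
```

The vertex buffer itself stays untouched; only an index buffer is added, so the GL_QUADS call becomes glDrawElements(GL_TRIANGLES, 6 * quadCount, GL_UNSIGNED_INT, 0).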

1 Answer


Figured it out! It turns out that my drawing code was working properly, but the coordinates I was generating for the square placed it so far away that it fell outside the view frustum and was never drawn. Thanks for the help anyway.
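(For anyone hitting the same symptom: a quick way to sanity-check a suspect vertex is to multiply it by the MVP matrix on the CPU and test whether the result lands inside the clip volume, i.e. -w <= x, y, z <= w. The sketch below is my own illustration; it assumes the column-major matrix layout that GLM and OpenGL use.)

```cpp
#include <array>
#include <cmath>

// 4x4 matrix in column-major order (the layout GLM/OpenGL use):
// m[c][r] is column c, row r. Vec4 is (x, y, z, w).
using Mat4 = std::array<std::array<float, 4>, 4>;
using Vec4 = std::array<float, 4>;

// clip = M * v, with M stored column-major.
Vec4 transform(const Mat4& m, const Vec4& v) {
    Vec4 out{};
    for (int r = 0; r < 4; ++r)
        for (int c = 0; c < 4; ++c)
            out[r] += m[c][r] * v[c];
    return out;
}

// A vertex survives clipping iff -w <= x, y, z <= w (with w > 0).
bool insideClipVolume(const Vec4& clip) {
    float w = clip[3];
    if (w <= 0.0f) return false;
    return std::fabs(clip[0]) <= w &&
           std::fabs(clip[1]) <= w &&
           std::fabs(clip[2]) <= w;
}
```

Running the generated building coordinates through this with the actual mvp_matrix would have shown immediately that every vertex was outside the clip volume.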