
I seem to be having some trouble drawing objects in OpenGL using VBOs. I've attempted to copy the example from: http://www.opengl.org/wiki/VBO_-_just_examples (number 2) but I can't get a plane to appear on screen.

Vertex.h:

#include <GL/freeglut.h>

struct Vertex {
    GLfloat position[3];
    GLfloat normal[3];
    GLfloat *uvs[2];
    unsigned short uvCount;
};

Triangles.h:

#include <GL/glew.h>
#include "Vertex.h"

class Triangles {
public: 
    Triangles(GLuint program, Vertex *vertices, unsigned int vertexCount, unsigned int *indices[3], unsigned int indiceCount);
    ~Triangles();
    void Draw();

private:
    GLuint program;
    GLuint VertexVBOID;
    GLuint IndexVBOID;
    GLuint VaoID;

    unsigned int *indices[3];
    unsigned int indiceCount;
};

Triangles.cpp:

#include "Triangles.h"
#include <stdio.h>
#include <stddef.h>
#include <string.h>

Triangles::Triangles(GLuint program, Vertex *vertices, unsigned int vertexCount, unsigned int *indices[3], unsigned int indiceCount) {
    memcpy(this->indices, indices, sizeof(int) * indiceCount * 3);
    this->indiceCount = indiceCount;
    this->program = program;

    glGenVertexArrays(1, &VaoID);
    glBindVertexArray(VaoID);

    glGenBuffers(1, &VertexVBOID);
    glBindBuffer(GL_ARRAY_BUFFER, VertexVBOID);
    glBufferData(GL_ARRAY_BUFFER, sizeof(Vertex) * vertexCount, vertices, GL_STATIC_DRAW);

    GLuint attributeLocation = glGetAttribLocation(program, "position");
    glEnableVertexAttribArray(attributeLocation);
    glVertexAttribPointer(attributeLocation, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid *)(offsetof(Vertex, position)));

    attributeLocation = glGetAttribLocation(program, "normal");
    glEnableVertexAttribArray(attributeLocation);
    glVertexAttribPointer(attributeLocation, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid *)(offsetof(Vertex, normal)));

    glGenBuffers(1, &IndexVBOID);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, IndexVBOID);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(unsigned int) * 3 * indiceCount, indices, GL_STATIC_DRAW);
};

Triangles::~Triangles() {
    glDisableVertexAttribArray(glGetAttribLocation(program, "position"));
    glDisableVertexAttribArray(glGetAttribLocation(program, "normal"));

    glDeleteBuffers(1, &VertexVBOID);
    glDeleteBuffers(1, &IndexVBOID);
    glDeleteVertexArrays(1, &VaoID);
}

void Triangles::Draw() {
    glBindVertexArray(VaoID);
    glDrawElements(GL_TRIANGLES, indiceCount, GL_UNSIGNED_INT, 0);
};

Excerpt from main.cpp (creating the Triangles object):

Vertex vertices[4];
vertices[0].position[0] = -1;
vertices[0].position[1] = 1;
vertices[0].position[2] = 0;
vertices[0].normal[0] = 0;
vertices[0].normal[1] = 0;
vertices[0].normal[2] = 1;
vertices[0].uvCount = 0;

vertices[1].position[0] = 1;
vertices[1].position[1] = 1;
vertices[1].position[2] = 0;
vertices[1].normal[0] = 0;
vertices[1].normal[1] = 0;
vertices[1].normal[2] = 1;
vertices[1].uvCount = 0;

vertices[2].position[0] = 1;
vertices[2].position[1] = -1;
vertices[2].position[2] = 0;
vertices[2].normal[0] = 0;
vertices[2].normal[1] = 0;
vertices[2].normal[2] = 1;
vertices[2].uvCount = 0;

vertices[3].position[0] = -1;
vertices[3].position[1] = -1;
vertices[3].position[2] = 0;
vertices[3].normal[0] = 0;
vertices[3].normal[1] = 0;
vertices[3].normal[2] = 1;
vertices[3].uvCount = 0;

unsigned int **indices;
indices = new unsigned int*[2];
indices[0] = new unsigned int[3];
indices[0][0] = 0;
indices[0][1] = 1;
indices[0][2] = 2;
indices[1] = new unsigned int[3];
indices[1][0] = 2;
indices[1][1] = 3;
indices[1][2] = 0;

Triangles *t = new Triangles(program, vertices, 4, indices, 2);

createShader(GLenum, char *):

GLuint createShader(GLenum type, char *file) {
    GLuint shader = glCreateShader(type);
    const char *fileData = textFileRead(file);
    glShaderSource(shader, 1, &fileData, NULL);

    glCompileShader(shader);
    return shader;
}

Shader loading:

    GLuint v = createShader(GL_VERTEX_SHADER, "vertexShader.vert");
    GLuint f = createShader(GL_FRAGMENT_SHADER, "fragmentShader.frag");

    program = glCreateProgram();

    glAttachShader(program, v);
    glAttachShader(program, f);
    glLinkProgram(program);

    glUseProgram(program);

vertexShader.vert:

in vec3 position;
in vec3 normal;

out vec3 a_normal;

void main() {
    gl_Position = vec4(position, 1.0);
}

fragmentShader.frag:

in vec3 a_normal;

out vec4 out_color;

void main() {
    out_color = vec4(1.0, 1.0, 1.0, 1.0);
}

Please let me know if more code is needed. As a side note, everything compiles just fine; I just don't see the plane I have constructed on screen (maybe because I didn't use colors?).

My OpenGL information is as follows:

  • Vendor: ATI Technologies Inc.
  • Renderer: ATI Radeon HD 5700 Series
  • Version: 3.2.9756 Compatibility Profile Context
  • Extensions: GL_AMDX_name_gen_delete GL_AMDX_random_access_target GL_AMDX_vertex_shader_tessellator GL_AMD_conservative_depth GL_AMD_draw_buffers_blend GL_AMD_performance_monitor GL_AMD_seamless_cubemap_per_texture GL_AMD_shader_stencil_export GL_AMD_texture …
Some notes, in no particular order: Do you do error checking (glGetError)? Where is modelviewproj set? Why are you using glRotate/Translate/PopMatrix if you're using a custom mvp matrix? Are you certain your attribute indices are 0 and 1? – Tim
Unfortunately I do not do error checking; I mainly just copied the example and am trying to run its code. I used matrix functions because I didn't realize the vertex shader would affect that. I assumed the matrix was set to the one at the top of the stack (the one I pushed before drawing). As for indices, I'm not exactly sure what you mean; I just drew the vertices on paper and decided the indices based on that. Unless you mean the position and normal values in glEnableVertexAttribArray(): I just used the values in the example, and I'm not sure how to change them or what they affect. – Benjamin Danger Johnson
In my project I also draw a cube (using glBegin and glEnd) as a test, and it seems to disappear when the shaders are active and working. Could this be a sign of a shader problem? If it is, does anyone know how I can fix it? – Benjamin Danger Johnson
Can you try to strip your program down to the absolute bare bones needed to draw a triangle? Get rid of all the translation and rotation code, and comment out anything with diffuse lighting and normals. The more complexity you add, the more potential problems there are. And please try to keep your question up to date with your latest code; it's confusing to see a lot of superfluous code. – Tim
I stripped it down a bit. I took out the extra stuff in the shaders (since I am just trying to get something to appear anyway), moved the Triangles code a bit, and took out the matrix stuff (since it doesn't work). You can basically ignore everything below Triangles.cpp; that stuff is mostly reference. I think I am doing everything right with creating the VBO and drawing, so I am pretty convinced I must be failing with the shaders somehow. I can just pull the stuff about shaders if need be, but I think this should be everything I need to actually draw a shape. – Benjamin Danger Johnson

1 Answer


In response to your comments:

Unfortunately I do not do error checking

You should always add some OpenGL error checking; it will save you from so many problems. It should look something like the following:

GLenum err = glGetError();
if (err != GL_NO_ERROR) {
   // throw an exception, log a message, or otherwise fail loudly
}
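
If you want something you can sprinkle around the VBO/VAO setup and the draw call, a small helper along these lines works. This is only a sketch: the name checkGLError and the fprintf logging are mine, not anything from your code, and it assumes GLEW is already initialized. Note that glGetError returns one queued error at a time, so drain it in a loop:

#include <GL/glew.h>
#include <cstdio>

// Drain the GL error queue and report where the check was made.
void checkGLError(const char *label) {
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR) {
        fprintf(stderr, "OpenGL error 0x%04X at %s\n", err, label);
    }
}

// e.g. checkGLError("Triangles constructor"); at the end of the constructor,
// and checkGLError("Triangles::Draw"); after glDrawElements.

A few calls like that will usually point you at the broken GL call within minutes.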

I used matrix functions because I didn't realize the vertex shader would affect that. I assumed the matrix was set to the one at the top of the stack (the one I pushed before drawing).

This is an incorrect assumption. The only variable that references the matrix stack is the special (though deprecated) built-in gl_ModelViewProjectionMatrix. What you currently have there is just an unused, uninitialized matrix, which completely ignores your matrix stack.
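
If you do want your own transform with the new-style shaders, the usual replacement for the matrix stack is a uniform mat4 that you upload yourself with glUniformMatrix4fv. A minimal sketch, assuming a uniform named mvp (the name and the identity matrix below are placeholders, not anything from your code):

// vertexShader.vert
uniform mat4 mvp;
in vec3 position;

void main() {
    gl_Position = mvp * vec4(position, 1.0);
}

// main.cpp, after glLinkProgram/glUseProgram
GLfloat identity[16] = {
    1.0f, 0.0f, 0.0f, 0.0f,
    0.0f, 1.0f, 0.0f, 0.0f,
    0.0f, 0.0f, 1.0f, 0.0f,
    0.0f, 0.0f, 0.0f, 1.0f
};
GLint mvpLocation = glGetUniformLocation(program, "mvp");
glUniformMatrix4fv(mvpLocation, 1, GL_FALSE, identity); // column-major, no transpose

Until you do something like that (or use the deprecated built-in), your glPushMatrix/glRotate/glTranslate calls have no effect on what the shader outputs.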

As for indices, I'm not exactly sure what you mean. I just drew the vertices on paper and decided the indices based on that.

I'm not referring to the indices of the triangles in your index buffer, but rather to the first parameter of your glVertexAttrib* functions. I suppose 'attribute location' is a more correct term than index.

glEnableVertexAttribArray(0);
glVertexAttribPointer(0, ...   //attrib location 0

glEnableVertexAttribArray(1);  
glVertexAttribPointer(1, ...   //attrib location 1

You seem to be assuming that "0" and "1" map to "position" and "normal". This is not a safe assumption to make. You should be querying the attribute locations for "position" and "normal" with glGetAttribLocation, and then using those values in glEnableVertexAttribArray and glVertexAttribPointer.
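
In code, the safe pattern looks like this (a rough sketch reusing the "position" name and the Vertex struct from your post). Also note that glGetAttribLocation returns a GLint, and -1 means the attribute was not found, so storing the result in a GLuint hides that failure:

GLint positionLocation = glGetAttribLocation(program, "position");
if (positionLocation == -1) {
    // "position" is not an active attribute: misspelled, or optimized away by the linker
} else {
    glEnableVertexAttribArray(positionLocation);
    glVertexAttribPointer(positionLocation, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                          (GLvoid *)offsetof(Vertex, position));
}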