
edited: added, at the bottom of the question, the creation of the _vertices and _indices members (and added the missing call to glEnableVertexAttribArray in the render function).

I have a problem which probably boils down to me misunderstanding some concept, so I am writing here in the hope of straightening that out.

I am having trouble with what seems to be a rather simple shader; or rather, I suspect, with the data transmitted to the shader.

Basically I am down to trying to make this work with a single triangle, and I still get a result that seems quite unexpected to me.

I will include the OpenGL code here along with the shader source, and hopefully someone can point me in the right direction.

First of all, let's start with the shaders, since they are about as simple as they get.

vertex-shader:

#version 330
in vec2 vertex_position;
in vec4 color;

smooth out vec4 vertex_color;

void main () {
    gl_Position = vec4(vertex_position, 0.0, 1.0);
    vertex_color = color;
}

fragment-shader:

#version 330
smooth in vec4 vertex_color;
out vec4 fragment_color;

void main () {
    fragment_color = vertex_color;
}

Expected result:

My expectation is that if I send a triangle with vertex colors [1,0,0], [0,1,0], [0,0,1], I get a triangle smoothly interpolated between red, green and blue across its surface (the classic RGB-triangle example, so I believe people know what to expect).
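For reference, a minimal sketch of what `smooth` interpolation computes (the function name and layout here are mine, not from the question): each fragment's color is a barycentric-weighted mix of the three vertex colors, so the centroid of a red/green/blue triangle comes out as an even grey-ish mix.

```cpp
#include <array>

// Sketch of what the rasterizer's "smooth" interpolation does per fragment:
// a barycentric-weighted mix of the three vertex colors. The weights
// w0 + w1 + w2 sum to 1 inside the triangle.
std::array<float, 3> interpolate(const std::array<float, 3> c[3],
                                 float w0, float w1, float w2) {
    return { c[0][0] * w0 + c[1][0] * w1 + c[2][0] * w2,
             c[0][1] * w0 + c[1][1] * w1 + c[2][1] * w2,
             c[0][2] * w0 + c[1][2] * w1 + c[2][2] * w2 };
}
```

At a vertex (weight 1 for that corner) you get the pure vertex color; at the centroid (weights 1/3 each) you get roughly (0.33, 0.33, 0.33).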

What I actually get seems to be a filled triangle in a single solid color, somewhere toward pale green.

(image: triangle rendered with this shader)

With this information I believe we can be fairly certain that the problem is not the shader, but rather the data we feed into it, right?

So, heading over to the data..

Vertex struct

typedef struct _gui_vertex {
    union {
        struct { float x, y; };
        float position[2];
    };
    union {
        struct { float r, g, b, a; };
        float color[4];
    };
} gui_vertex;

I kind of like this layout, and it should be sound unless I am missing something fundamental. I use the unions basically as syntactic sugar, so I can access the data in different ways, but float x, y should be memory-wise the same as position[2], so I don't think the data type is to blame. That leaves the OpenGL code, which I actually do suspect.
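That assumption can be checked mechanically. A sketch (the struct reproduced from above, with the typedef name spelled out): if the compiler accepts these static_asserts, the stride and offset arithmetic later passed to glVertexAttribPointer is safe for this layout. Note that anonymous structs inside unions are a C11 feature and a common C++ extension (GCC, Clang, MSVC all accept them).

```cpp
#include <cstddef>

typedef struct _gui_vertex {
    union {
        struct { float x, y; };      // named access: v.x, v.y
        float position[2];           // array access: v.position[0], [1]
    };
    union {
        struct { float r, g, b, a; };
        float color[4];
    };
} gui_vertex;

// If these hold, the struct is six tightly packed floats, so
// stride = sizeof(gui_vertex) and the offsetof()-based offsets are correct.
static_assert(sizeof(gui_vertex) == 6 * sizeof(float),
              "unexpected padding in gui_vertex");
static_assert(offsetof(gui_vertex, position) == 0,
              "position must sit at the start");
static_assert(offsetof(gui_vertex, color) == 2 * sizeof(float),
              "color must directly follow position");
```

Since the checks are compile-time, any padding surprise shows up as a build error rather than as garbage vertex data.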

I have a class called GUI_Element (actually gui::element, but I will leave the namespaces out because I think it is easier to read and understand that way).

The declaration of the class looks something like this:

gui_element.h

class GUI_Element {
public:
    void render();
private:
    void create_buffers();

    gui_vertex _vertices[4];
    GLuint _indices[4];

    GLuint _vertex_buffer;
    GLuint _index_buffer;
};

For the implementation I will first go through the create_buffers() function and then the render function. Assume that the data was assigned to the _vertices and _indices arrays; I have checked them at various points of execution and they do indeed contain the correct data for the triangle with the red, green and blue vertices, along with the correct indices.

implementation create_buffers()

glGenBuffers (1, &_vertex_buffer);
glBindBuffer (GL_ARRAY_BUFFER, _vertex_buffer);
    glBufferData(
        GL_ARRAY_BUFFER,
        sizeof(gui_vertex)*4,
        _vertices,
        GL_STATIC_DRAW
    );

glGenBuffers (1, &_index_buffer);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _index_buffer);
glBufferData(
    GL_ELEMENT_ARRAY_BUFFER,
    sizeof(GLuint)*4,
    _indices,
    GL_STATIC_DRAW
);
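One thing worth checking with a 3.3 context, as a hedged aside (the question does not show the context setup): in a core profile, vertex attribute state must be captured in a vertex array object, and with no VAO bound the attribute-pointer and draw calls can fail silently or raise GL_INVALID_OPERATION. A minimal sketch, assuming a hypothetical GLuint _vao member added to the class:

```cpp
// Hypothetical addition: record the attribute setup into a VAO once in
// create_buffers(); render() then only needs glBindVertexArray(_vao)
// before glDrawElements.
glGenVertexArrays(1, &_vao);          // _vao is an assumed GLuint member
glBindVertexArray(_vao);

glBindBuffer(GL_ARRAY_BUFFER, _vertex_buffer);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _index_buffer);
// ...the glVertexAttribPointer / glEnableVertexAttribArray calls from
//    render() would be recorded into the VAO here...

glBindVertexArray(0);                 // unbind to avoid accidental edits
```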

And on to the render function. I will try to keep it as close as I can to my implementation, but I have simplified it, since it depends on a wrapper handling attributes and the like (the shader communication). I am fairly certain the wrapper works, but I am open to the possibility that something is wrong there; if my posted code looks correct I can go deeper into that implementation, but I believe it is nicer to read if I keep the code to a minimum.

implementation render()

GLShaderProgram *program; // my shader-wrapper, basically keep track of attribs, and the program along with loading, compiling and linking shaders.
program->use();

glBindBuffer (GL_ARRAY_BUFFER, _vertex_buffer);
glBindBuffer (GL_ELEMENT_ARRAY_BUFFER, _index_buffer);

glVertexAttribPointer (
    program->attribute_index("vertex_position"),
    2,
    GL_FLOAT,
    GL_FALSE,
    sizeof(gui_vertex),
    (const GLvoid *) offsetof(gui_vertex, position)
);

glVertexAttribPointer (
    program->attribute_index("color"),
    4,
    GL_FLOAT,
    GL_FALSE,
    sizeof(gui_vertex),
    (const GLvoid *) offsetof(gui_vertex, color)
);

glEnableVertexAttribArray (program->attribute_index("vertex_position"));
glEnableVertexAttribArray (program->attribute_index("color"));

glDrawElements(GL_TRIANGLES, 3, GL_UNSIGNED_INT, (void *)0);
// disable stuff, left out for readability.

So that is pretty much it. In the implementation, the program->attribute_index function maps an attribute name to an index using glGetAttribLocation, with a hash table to keep track of them; nothing really magic. The setup for the rendering is really simple: a no-depth-buffer ortho, and there just is not much more to say about it.
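For completeness, a sketch of the caching scheme described (hypothetical names, not the actual wrapper): the location is resolved once and subsequent lookups hit the hash table. The lookup is injected here so the sketch stays testable without a GL context; in the real wrapper it would be glGetAttribLocation(program, name.c_str()).

```cpp
#include <functional>
#include <string>
#include <unordered_map>

// Hypothetical stand-in for the GLShaderProgram attribute cache described
// in the text. `lookup` plays the role of glGetAttribLocation.
struct AttribCache {
    std::unordered_map<std::string, int> locations;
    std::function<int(const std::string &)> lookup;

    int attribute_index(const std::string &name) {
        auto it = locations.find(name);
        if (it != locations.end())
            return it->second;            // cache hit: no GL query
        int loc = lookup(name);           // one real GL query per name
        locations.emplace(name, loc);
        return loc;
    }
};
```

One caveat of this design: a name the shader does not declare yields -1, which gets cached too, so a typo in an attribute name fails quietly on every call.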

The error probably lies somewhere in the render function, perhaps in the glVertexAttribPointer calls? That seems most likely; perhaps I am thinking about the offsets and strides the wrong way?

Anyhow, I am thankful that you took the time to look at this. I might add that I would like it to work with GLSL 3.30 at the highest, since that is what my graphics card can muster (or at least the supported version it reports).

creating _vertices and _indices

In the constructor I do this to fill in the data for _vertices and _indices:

_vertices[0] = { {{-50.0f, -50.0f}}, {{1.0f, 0.0f, 0.0f, 1.0f}} };
_vertices[1] = { {{ 50.0f, -50.0f}}, {{0.0f, 1.0f, 0.0f, 1.0f}} };
_vertices[2] = { {{ 50.0f,  50.0f}}, {{0.0f, 0.0f, 1.0f, 1.0f}} };

_indices[0] = 0;
_indices[1] = 1;
_indices[2] = 2;

I did, however, realize that I lied a bit earlier: _indices[2] was actually set to a different vertex, which is why the triangle looked green. When I corrected it according to the layout above I get a purple-looking triangle (and in the other corner, since the erroneous index actually pointed at a fourth vertex making up a quad, which had color 1, 1, 1, 1, i.e. white).

Comments:

leemes: I know you checked that you fill the (local) arrays with the correct data; but please still show us the code that creates it (or a debug print). It can still be wrong.

qrikko: Absolutely, I will update the original question with the information.

leemes: Ah, I don't see any glEnableVertexAttribArray. Not sure if this is needed, since I'm always confused by the different GL versions and how you give GL enough information for attribute data (there are several ways to do it). But try adding glEnableVertexAttribArray(program->attribute_index("...")) for both attributes...

leemes: Also, the color seems to be (0.5, 1.0, 0.5, 1.0). Are those numbers telling you anything? Maybe they are part of some other data in the buffer (so maybe there is an offset error when pointing to them)?

qrikko: @leemes I will let you know how that pans out.. perhaps that could be the cause.. will try it out at once :)

1 Answer


As it turns out...

I was just being "stupid".. It was all working just fine all along: I did see an interpolated image and it was all correct, I just did not see much of it -_-

Only a tiny sliver, because the coordinate system with my ortho setup turns out to be [-1, 1] on both axes. I drew my triangle thinking it was [-50, 50] (on both axes), so I only saw about a one-fiftieth slice of the interpolated triangle.

Sorry to have wasted the time of anyone reading, and thanks to the people who tried to help!

(i.e. if I draw a triangle which fits inside the viewport, it does work.)
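The realization above can be sketched as a simple mapping: with no projection matrix applied, gl_Position is interpreted directly as normalized device coordinates, visible only in [-1, 1] on each axis, so pixel-like coordinates such as ±50 must be divided down (or run through an orthographic matrix). A hypothetical helper:

```cpp
// With no projection, clip/NDC space spans [-1, 1]; coordinates in a
// larger "world" range must be scaled into it. half_extent is half the
// intended world width/height (50 for the question's +-50 triangle).
float to_ndc(float v, float half_extent) {
    return v / half_extent;
}
```

With half_extent = 50, the triangle's corners land exactly on the edges of the visible range instead of 50 screens away, which is why only a sliver of the interpolation was visible before.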