
I'm trying to render the contents of a buffer using a vertex shader. It worked when I was using a plain vec3 "in" attribute, but now I need to pass more information per vertex. I made a struct that I'm going to use in the vertex shader, a geometry shader and a compute shader:

struct StellarEntity
{
    vec3 position;
    vec3 velocity;
    double mass;
    double radius;
};

Here's my vertex shader (of course the struct is declared in there):

#version 430

uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat4 modelMatrix;

layout(location = 0) in StellarEntity in_entity;

void main()
{
    gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(in_entity.position, 1.0);
}

I think the problem may be the way I bind my vertex attributes.

vao = new VAO();
//"entities" is a StellarEntity[256] with all data initialized
vao.BufferData(BufferTarget.ArrayBuffer, entities, BufferUsageHint.StaticDraw);
vao.VertexAttribPointer(0, 3, VertexAttribPointerType.Double, false, 0, 0);
vao.VertexAttribPointer(0, 3, VertexAttribPointerType.Double, false, 0, 0);
vao.VertexAttribPointer(0, 1, VertexAttribPointerType.Double, false, 0, 0);
vao.VertexAttribPointer(0, 1, VertexAttribPointerType.Double, false, 0, 0);

I don't really understand the stride and offset (the last two parameters of VertexAttribPointer), so I set them to 0...

EDIT: I forgot to include my BindAttribLocation call, so here it is:

//"program" being my shader program, after linking (this worked fine for simpler shaders)
program.BindAttribLocation(0, "in_entity");

2 Answers


Firstly, you cannot use a struct as an input to your vertex shader. You should just use four separate vertex attributes:

layout(location = 0) in vec3 position;
layout(location = 1) in vec3 velocity;
layout(location = 2) in double mass;
layout(location = 3) in double radius;

(I think you should think twice about using doubles here; on GPUs, they do not perform as well as floats.)

Secondly, the stride is the byte offset between the attribute for one vertex and the attribute for the next. In your case it should be the size of your StellarEntity struct in bytes. The offset is the offset in bytes from the beginning of the bound buffer to the attribute for the first vertex. If you change mass and radius to floats, the stride is 32 and the offsets are 0, 12, 24 and 28. Your VertexAttribPointer calls then become:

vao.VertexAttribPointer(0, 3, VertexAttribPointerType.Float, false, 32, 0);
vao.VertexAttribPointer(1, 3, VertexAttribPointerType.Float, false, 32, 12);
vao.VertexAttribPointer(2, 1, VertexAttribPointerType.Float, false, 32, 24);
vao.VertexAttribPointer(3, 1, VertexAttribPointerType.Float, false, 32, 28);

(Disclaimer: I do not use C#, so I'm assuming that the OpenGL bindings are straightforward and that the vec3 type uses floats, not doubles.)

Lastly, program.BindAttribLocation(0, "in_entity"); won't do anything if you explicitly set the locations of the vertex inputs in the shader.


Solved the problem: it was a stride and offset problem. For those looking for answers: the stride is the total size in bytes of one element in the array, and the offset is the number of bytes from the start of the element to the field. So for a struct:

struct A {
    Vector4f pos;
    Vector4f norm;
}

The stride in both calls will be 32 (two 16-byte Vector4fs). The first offset will be 0, and the second will be 16.