2
votes

I'm a little confused about the usage of ELEMENT_ARRAY_BUFFER (index buffers) vs. ARRAY_BUFFER (array buffers) and am looking for clarification / a better understanding.

Specifically, I'm confused about this: given the presence of multiple array buffers, how do WebGL and an index buffer know which array buffer to reference?

Using code obtained from Mozilla's Creating 3D objects using WebGL demo as the basis, I understand that array buffers are assigned and initialized as

cubeVerticesBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, cubeVerticesBuffer);
var vertices = [
    // Front face
    -1.0, -1.0,  1.0,
     1.0, -1.0,  1.0,
     1.0,  1.0,  1.0,
    -1.0,  1.0,  1.0,

    // Back face
    -1.0, -1.0, -1.0,
    -1.0,  1.0, -1.0,
     1.0,  1.0, -1.0,
     1.0, -1.0, -1.0,

    // Top face
    -1.0,  1.0, -1.0,
    -1.0,  1.0,  1.0,
     1.0,  1.0,  1.0,
     1.0,  1.0, -1.0,

    // Bottom face
    -1.0, -1.0, -1.0,
     1.0, -1.0, -1.0,
     1.0, -1.0,  1.0,
    -1.0, -1.0,  1.0,

    // Right face
     1.0, -1.0, -1.0,
     1.0,  1.0, -1.0,
     1.0,  1.0,  1.0,
     1.0, -1.0,  1.0,

    // Left face
    -1.0, -1.0, -1.0,
    -1.0, -1.0,  1.0,
    -1.0,  1.0,  1.0,
    -1.0,  1.0, -1.0
];
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.STATIC_DRAW);

And then the indices are stored in the index buffer as

cubeVerticesIndexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, cubeVerticesIndexBuffer);  
var cubeVertexIndices = [
    0,  1,  2,      0,  2,  3,    // front
    4,  5,  6,      4,  6,  7,    // back
    8,  9,  10,     8,  10, 11,   // top
   12, 13, 14,     12, 14, 15,   // bottom
   16, 17, 18,     16, 18, 19,   // right
   20, 21, 22,     20, 22, 23    // left
];
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(cubeVertexIndices), gl.STATIC_DRAW);

But given there is also a color array buffer defined as

var colors = [
    [1.0,  1.0,  1.0,  1.0],    // Front face: white
    [1.0,  0.0,  0.0,  1.0],    // Back face: red
    [0.0,  1.0,  0.0,  1.0],    // Top face: green
    [0.0,  0.0,  1.0,  1.0],    // Bottom face: blue
    [1.0,  1.0,  0.0,  1.0],    // Right face: yellow
    [1.0,  0.0,  1.0,  1.0]     // Left face: purple
];
var generatedColors = [];
for (var j = 0; j < 6; j++) {
    var c = colors[j];
    // Repeat each face's color once per vertex (4 vertices per face)
    for (var i = 0; i < 4; i++) {
        generatedColors = generatedColors.concat(c);
    }
}

cubeVerticesColorBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, cubeVerticesColorBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(generatedColors), gl.STATIC_DRAW);

But then once it gets to the draw routine, the ordering is

gl.bindBuffer(gl.ARRAY_BUFFER, cubeVerticesBuffer);
gl.vertexAttribPointer(vertexPositionAttribute, 3, gl.FLOAT, false, 0, 0);

gl.bindBuffer(gl.ARRAY_BUFFER, cubeVerticesColorBuffer);
gl.vertexAttribPointer(vertexColorAttribute, 4, gl.FLOAT, false, 0, 0);

gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, cubeVerticesIndexBuffer);
gl.drawElements(gl.TRIANGLES, 36, gl.UNSIGNED_SHORT, 0);

I'm assuming that the bindBuffer method tells the WebGL "state machine" to make a generic buffer "active", and that gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, ...) makes a new index buffer active based on the current generic buffers.

Somehow, I don't think my understanding of what is going on there is totally correct. Any clarification would be greatly appreciated. Specifically, I'd like at some point to add additional objects with different buffers (say a sphere or a torus, with a whole other color scheme).

But I understand that drawElements() just draws based on the indices, rather than walking the arrays directly the way drawArrays() does.

Correct?


2 Answers

5
votes

The hidden curse of OpenGL and its variants (WebGL, etc.) is the state machine. When you call glBindBuffer(GL_ARRAY_BUFFER, buffer), the state machine sets buffer as the active array buffer. Likewise, when you call glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexbuffer), the state machine sets indexbuffer as the active element array buffer (or active index buffer).

When calling glVertexAttribPointer, the data for the given attribute index comes from the active array buffer. This is exactly how you have described it in your last code snippet. When you come to issue a draw call (such as glDrawElements), the active index buffer is used to index into the attributes.
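For the multiple-object case raised in the question (a sphere or torus with its own buffers), the same bind / vertexAttribPointer / drawElements sequence is simply repeated per object. A minimal sketch; the helper name and the object fields (positionBuffer, colorBuffer, indexBuffer, indexCount) are hypothetical, not part of the original demo:

```javascript
// Sketch: draw one indexed object. Re-pointing the attributes at this
// object's array buffers must happen before its index buffer is used.
function drawIndexedObject(gl, obj, positionAttrib, colorAttrib) {
  gl.bindBuffer(gl.ARRAY_BUFFER, obj.positionBuffer);
  gl.vertexAttribPointer(positionAttrib, 3, gl.FLOAT, false, 0, 0);

  gl.bindBuffer(gl.ARRAY_BUFFER, obj.colorBuffer);
  gl.vertexAttribPointer(colorAttrib, 4, gl.FLOAT, false, 0, 0);

  gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, obj.indexBuffer);
  gl.drawElements(gl.TRIANGLES, obj.indexCount, gl.UNSIGNED_SHORT, 0);
}

// Per frame, e.g.:
// drawIndexedObject(gl, cube,  vertexPositionAttribute, vertexColorAttribute);
// drawIndexedObject(gl, torus, vertexPositionAttribute, vertexColorAttribute);
```

Because vertexAttribPointer captures the buffer bound at call time, calling this once per object each frame is enough to switch all attribute sources before each draw.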

Specifically, I'm confused about this: given the presence of multiple array buffers, how do WebGL and an index buffer know which array buffer to reference?

The most important point to realise is that a single vertex is not simply its position, but a combination of all of its attributes. In your case, a position and a color make up a single vertex. Now, looking at the attributes you have specified:

Position: | pos0   | pos1   | pos2   | pos3   | pos4   | pos5   | pos6   |
Color:    | color0 | color1 | color2 | color3 | color4 | color5 | color6 |

Each index in your index buffer refers to a combination of both the position and a color attribute. That is, index 0 will fetch pos0 and color0; index 5 will fetch pos5 and color5.

To answer your question, the index buffer refers to all array buffers that have been referenced when calling glVertexAttribPointer.

(Strictly speaking, an index buffer refers to all enabled attributes. The last parameter of glVertexAttribPointer lets you specify an offset into the active array buffer so that multiple sets of attributes can come from the same array buffer.)
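To make that concrete, here is a toy plain-JavaScript simulation of what drawElements conceptually does with the index list; this is illustrative only, not actual WebGL API usage:

```javascript
// One index pulls the matching slice out of EVERY enabled attribute
// array: 3 floats per vertex from positions, 4 from colors.
function fetchVertex(index, positions, colors) {
  return {
    position: positions.slice(index * 3, index * 3 + 3),
    color:    colors.slice(index * 4, index * 4 + 4),
  };
}

// "drawElements": walk the index list, fetching one full vertex per index.
function simulateDrawElements(indices, positions, colors) {
  return indices.map(function (i) { return fetchVertex(i, positions, colors); });
}

var positions = [0,0,0,  1,0,0,  1,1,0,  0,1,0];         // pos0..pos3
var colors    = [1,1,1,1,  1,0,0,1,  0,1,0,1,  0,0,1,1]; // color0..color3

var tri = simulateDrawElements([0, 2, 3], positions, colors);
// tri[1].position → [1, 1, 0] (pos2), tri[1].color → [0, 1, 0, 1] (color2)
```

Index 2 fetches pos2 *and* color2 together, which is exactly why the demo duplicates vertex positions per face: each face needs its own color paired with the shared corner positions.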

1
votes

gl.vertexAttribPointer tells WebGL what kind of data to expect in the buffer currently bound by gl.bindBuffer. Then, in the shaders, you program how you want to use the data you provided in the buffers.

It's kind of like this: suppose in the shader I want to use two buffers, A and B, where for each vertex I want 3 floats from A and 2 floats from B.

gl.bufferData sends an array of consecutive floats* to the GPU, and the GPU doesn't know how to interpret that array of consecutive floats.

So I need to use gl.vertexAttribPointer to tell the GPU how the data is formatted, and that is basically what it does.

But, the way the WebGL/OpenGL API is written, I cannot just do bufferA.vertexAttribPointer(); instead, I first have to set an imaginary global variable BOUND_BUFFER with gl.bindBuffer. Then you can imagine that vertexAttribPointer is implemented to act on the global BOUND_BUFFER "instance" and do whatever it needs to do.
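That imaginary global can be written out as a toy model in plain JavaScript (purely illustrative; the real state lives inside the GL implementation, not in user code):

```javascript
// Toy model of the GL binding state machine. BOUND_BUFFER from the
// text corresponds to state.ARRAY_BUFFER here.
var state = { ARRAY_BUFFER: null, ELEMENT_ARRAY_BUFFER: null, attribs: {} };

function bindBuffer(target, buffer) {
  state[target] = buffer;               // set the "imaginary global"
}

function vertexAttribPointer(attribIndex, size) {
  // Snapshot WHICH buffer is bound to ARRAY_BUFFER right now;
  // rebinding ARRAY_BUFFER later does not change this association.
  state.attribs[attribIndex] = { buffer: state.ARRAY_BUFFER, size: size };
}

bindBuffer('ARRAY_BUFFER', 'bufferA');
vertexAttribPointer(0, 3);              // attribute 0: 3 floats from A
bindBuffer('ARRAY_BUFFER', 'bufferB');
vertexAttribPointer(1, 2);              // attribute 1: 2 floats from B
// state.attribs[0].buffer is still 'bufferA'
```

This is why the draw routine in the question can rebind ARRAY_BUFFER repeatedly: each vertexAttribPointer call latches the binding that was current at that moment.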

  • the floats can be any valid WebGL type, of course.