
I've got a strange error on my Nexus 4 with OpenGL ES 2 when I use vertex array objects.

Here is some information:

  • Everything works when I don't use VAOs
  • Everything works on other devices, and on an iPad 2 both with and without VAOs
  • glGetError() doesn't return any error
  • Because of the error, some glitches appear in the game (some elements take on a different appearance)
  • My VBOs are dynamic (I update them with glBufferData)

Here is the error:

Adreno-ES20(16818): : validate_vertex_attrib_state: No vertex attrib is enabled in a draw call!

And here is my code:

void Renderer::setVertexBuffer( Uint32 stream, const Base* vertexBuffer, std::size_t stride, Uint32 startVertex, Uint32 endVertex )
{
    static const bool VAOSupported = this->isExtensionPresent(VertexArrayObject);
    if( VAOSupported )
    {
        if( vertexBuffer->vao.isReady() == false )
        {
            // Bind VAO.
            glBindVertexArrayOES( vertexBuffer->vao.getId() );

            // Bind filled VBO.
            glCheck( glBindBuffer( GL_ARRAY_BUFFER, vertexBuffer->getId() ) );

            // Set attributes with the vertex format.
            this->applyVertexFormat( startVertex, endVertex );

            // Unbind buffer and VAO.
            glBindVertexArrayOES(0);

            vertexBuffer->vao.isReady(true);
        }

        glBindVertexArrayOES( vertexBuffer->vao.getId() );
    }
    else
    {
        glBindVertexArrayOES(0);
        glCheck( glBindBuffer( GL_ARRAY_BUFFER, vertexBuffer->getId() ) );
        this->applyVertexFormat( startVertex, endVertex );
    }
}

////////////////////////////////////////////////////////////
void Renderer::setIndexBuffer( const Buffer* indexBuffer, std::size_t stride )
{
    glCheck( glBindBuffer( GL_ELEMENT_ARRAY_BUFFER, indexBuffer->getId() ) );
    this->usedIndexBufferStride = stride;
}

////////////////////////////////////////////////////////////
void Renderer::applyVertexFormat( Uint32 startVertex, Uint32 endVertex )
{
    const Uint32 stride = this->vertexFormat->getStride();
    for( Uint32 i = 0; i < this->vertexFormat->getAttributCount(); i++ )
    {
        const VertexElement& element = this->vertexFormat->getAttribut(i);

        glCheck( glEnableVertexAttribArray( element.usage ) );
        glCheck( glVertexAttribPointer( element.usage,
                                       element.type,
                                       element.type,
                                       element.normalize,
                                       stride,
                                       BUFFER_OFFSET(element.offset + startVertex * stride ) ) );
    }
}

And here is how I use it:

renderer->setFormat(geometry->getFormat()); // Only save a pointer to the format to use in apply method.
renderer->setVertexBuffer(geometry->getVertexBuffer());
renderer->setIndexBuffer(geometry->getIndexBuffer());
renderer->draw(GL_TRIANGLES, geometry->indiceCount);

1 Answer


Are you sure usage is an appropriate name for the field that defines which attribute array to associate the pointer with? Buffer objects already have a property called usage (e.g. GL_DYNAMIC_DRAW). location might make more sense.

You have a much more serious issue in your code, however:

element.type cannot be both the type of your data and the number of components. glVertexAttribPointer (...) only accepts 1, 2, 3 or 4 for the component count; an enumerant like GL_FLOAT has a value much larger than 4.

Assuming glCheck( ... ) correctly wraps glGetError (...), this call should report GL_INVALID_VALUE.

Since you see no error at all, this leads me to believe that the loop in void Renderer::applyVertexFormat( Uint32 startVertex, Uint32 endVertex ) never actually runs, which would also explain the driver's "No vertex attrib is enabled" warning: glEnableVertexAttribArray is never called either.