I've been porting some existing game framework technology to Android and, consequently, I've been working in OpenGL ES 2.0. I have set up a sprite batching class very similar to XNA's SpriteBatch. While the class works correctly under XNA, it produces odd results when used with my OpenGL framework. In short, it draws the sprites in the right place and with the correct rotation, but whichever texture is drawn last is used for every sprite.

Put simply, if I instruct it to draw two instances of Texture A and then one instance of Texture B, it will draw three instances of Texture B. If I reverse the order, it will draw three instances of Texture A. Whichever texture is sent last replaces the others. The obvious explanation would be that I am overwriting the texture in the slot the shader samples from. However, I do not believe I am doing so (though my knowledge of OpenGL is limited). To help diagnose the issue, I dispersed console output statements throughout my code base wherever the OpenGL API is called (excluding shader compilation, linking, etc.). After running the code to a breakpoint after the first frame, the output suggests that the correct texture is assigned at each point.

GL: Setting front-face direction to CCW.
GL: Generating texture 1.
GL: Binding texture 1 to Texture2D.
GL: Setting texture data for Texture2D.
GL: Generating mipmap for Texture2D.

GL: Generating texture 2.
GL: Binding texture 2 to Texture2D.
GL: Setting texture data for Texture2D.
GL: Generating mipmap for Texture2D.
GL: Generating texture 3.
GL: Binding texture 3 to Texture2D.
GL: Setting texture data for Texture2D.
GL: Generating mipmap for Texture2D.

GL: Setting Viewport to 0,0,720,1134.
GL: Setting clear colour to: 0,0,0,0.
GL: Clearing Color Buffer.
GL: Setting blend equation separate to FuncAdd, FuncAdd.
GL: Setting blend function separate to One, OneMinusSrcAlpha, One, OneMinusSrcAlpha.
GL: Setting TextureMinFilter to 9987.
GL: Setting TextureMagFilter to 9729.
GL: Setting TextureWrapS to 33071.
GL: Setting TextureWrapT to 33071.
GL: Setting shader parameter 0 to Ozymandias.Mathematics.Matrix4. (MVP Matrix)

GL: Setting active texture unit to Texture0.
GL: Binding texture 2 to Texture2D.
GL: Setting shader parameter 1 to 0. (Texture Slot)
GL: Setting vertex attribute pointer for attribute 1. Current Texture ID: 2.
GL: Enabling vertex attribute array for attribute 1.
GL: Setting vertex attribute pointer for attribute 0. Current Texture ID: 2.
GL: Enabling vertex attribute array for attribute 0.
GL: Setting vertex attribute pointer for attribute 2. Current Texture ID: 2.
GL: Enabling vertex attribute array for attribute 2.
GL: Drawing elements. Current Texture ID: 2.

GL: Setting active texture unit to Texture0.
GL: Binding texture 1 to Texture2D.
GL: Setting shader parameter 1 to 0.
GL: Setting vertex attribute pointer for attribute 1. Current Texture ID: 1.
GL: Enabling vertex attribute array for attribute 1.
GL: Setting vertex attribute pointer for attribute 0. Current Texture ID: 1.
GL: Enabling vertex attribute array for attribute 0.
GL: Setting vertex attribute pointer for attribute 2. Current Texture ID: 1.
GL: Enabling vertex attribute array for attribute 2.
GL: Drawing elements. Current Texture ID: 1.

GL: Swapping Buffers.

The log certainly seems to indicate that the correct texture was bound at the time of each draw and attribute call. Unfortunately, this process touches quite a lot of code. The texture, graphics, and sprite batching systems are clearly at the centre of it, and it tangentially touches the effect system as well. I'll try to post only the code that seems pertinent, without omitting anything that might prove important.
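For context, each line in the log above is printed by a thin wrapper around the corresponding OpenGL call. A minimal sketch of the pattern, assuming OpenTK-style bindings (the GLDebug class and these two methods are illustrative, not my actual code):

internal static class GLDebug
{
    // Logs and then forwards a texture bind.
    public static void BindTexture(TextureTarget target, int textureId)
    {
        Console.WriteLine("GL: Binding texture {0} to {1}.", textureId, target);
        GL.BindTexture(target, textureId);
    }

    // Logs and then forwards an active texture unit change.
    public static void ActiveTexture(TextureUnit unit)
    {
        Console.WriteLine("GL: Setting active texture unit to {0}.", unit);
        GL.ActiveTexture(unit);
    }
}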

The sprite renderer runs a setup before flushing. Any parameters that don't exist on the effect are ignored.

private void Setup()
{
    graphicsDevice.BlendState = blendState;
    graphicsDevice.SamplerState = samplerState;

    if (!effect.ApplyAfterParamters)
        effect.Apply(0); // Apply first pass if effect applied before parameters set.

    Matrix4 mvpMatrix = (Matrix4)(GraphicsDevice.Viewport.GetViewMatrix() * transformMatrix);
    effect.SetParameter("MVPMatrix", mvpMatrix);
    effect.SetParameter("TextureSlot", 0);

    if (effect.ApplyAfterParamters)
        effect.Apply(0); // Apply first pass if effect applied after parameters set.
}

It then iterates through the sprites and draws each batch. It's worth noting that the problem occurred even when each sprite was treated as its own batch.

private void Flush()
{
    if (sprites.Count == 0)
        return;

    int batchStart = 0;
    Texture batchTexture = sprites[0].Texture;

    for (int i = 1; i < sprites.Count; i++)
    {
        Texture currentTexture = sprites[i].Texture;

        if (currentTexture.SortingID != batchTexture.SortingID)
        {
            if (i > batchStart)
                RenderBatch(batchTexture, batchStart, i - batchStart);

            batchTexture = currentTexture;
            batchStart = i;
        }
    }

    RenderBatch(batchTexture, batchStart, sprites.Count - batchStart);
    sprites.Clear();
}
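Tracing the loop above, a draw order of [A, A, B] produces two calls, RenderBatch(A, 0, 2) followed by RenderBatch(B, 2, 1); reversing the order produces RenderBatch(B, 0, 1) and then RenderBatch(A, 1, 2).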

private void RenderBatch(Texture texture, int first, int count)
{
    // Oversized batches and buffer capacity handled here.

    // Set texture.
    texture.SetActive(0); // Sets active slot to slot 0 and binds the texture.
    effect.SetParameter("SpriteTexture", texture);
    effect.SetParameter("Texture", texture);
    if (effect.ApplyAfterParamters)
        effect.Apply(0);

    // Buffer position for draw offset.
    int drawOffset = bufferPosition;

    // Render sprites.
    for (int i = 0; i < count; i++)
        RenderSprite(sprites[first + i]);

    int drawCount = bufferPosition - drawOffset;

    // Draw primitives.
    graphicsDevice.DrawUserIndexedPrimitives<VertexPositionColorTexture>(PrimitiveType.TriangleList, vertices, drawOffset * VerticesPerSprite, drawCount * VerticesPerSprite, indices, 0, drawCount * 2);
}

// RenderSprite populates the vertex array.
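The real method isn't shown here, but as a rough sketch of what it does, assuming four vertices per sprite and corner/colour properties on the sprite (the property names and the VertexPositionColorTexture constructor are illustrative):

// Hypothetical sketch of RenderSprite, for illustration only. It writes one
// sprite's four corner vertices into the shared vertex array and advances
// bufferPosition, which RenderBatch uses to compute the draw range.
private void RenderSprite(Sprite sprite)
{
    int v = bufferPosition * VerticesPerSprite; // Four vertices per sprite.
    vertices[v + 0] = new VertexPositionColorTexture(sprite.TopLeft, sprite.Colour, new Vector2(0f, 0f));
    vertices[v + 1] = new VertexPositionColorTexture(sprite.TopRight, sprite.Colour, new Vector2(1f, 0f));
    vertices[v + 2] = new VertexPositionColorTexture(sprite.BottomLeft, sprite.Colour, new Vector2(0f, 1f));
    vertices[v + 3] = new VertexPositionColorTexture(sprite.BottomRight, sprite.Colour, new Vector2(1f, 1f));
    bufferPosition++;
}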

This is how a texture is made active and how effect texture parameters are set.

// Effect setting texture parameter.
protected override void SetParameterImpl(int index, Graphics.Texture value)
{
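    // Sampler uniforms take a texture *unit* index (0 for GL_TEXTURE0),
    // not the GL texture name; SlotIndex is the unit set by SetActive below.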
    GL.Uniform1(index, value.SlotIndex);
}

// Texture.SetActive
public override void SetActive(int index)
{
    // Index checking here.
    GL.ActiveTexture((TextureUnit)((int)TextureUnit.Texture0 + index));
    GL.BindTexture(TextureTarget.Texture2D, this.TextureID);
    GLHelper.CheckError();
    this.index = index;
}

Finally, this is how the OpenGL ES 2.0 graphics device handles drawing the primitives.

protected override void DrawUserIndexedPrimitivesImpl<T>(PrimitiveType primitiveType, T[] vertices, int vertexOffset, int numVertices, short[] indices, int indexOffset, int primitiveCount, VertexDeclaration declaration)
{
    // Validation here.

    var vHandle = GCHandle.Alloc(vertices, GCHandleType.Pinned);
    var iHandle = GCHandle.Alloc(indices, GCHandleType.Pinned);

    ShaderVertexMap binding = currentProgram.GetBindings(declaration); // Maps vertex struct to shader program attributes.
    ShaderAttribute attribute;

    for (int i = 0; i < binding.Length; i++)
    {
        if (binding[i].Index >= 0)
        {
            attribute = currentProgram.Attributes[binding[i].Index];

            GL.VertexAttribPointer(attribute.Index, binding[i].Size, binding[i].Type, binding[i].Normalised, declaration.Stride, IntPtr.Add(vHandle.AddrOfPinnedObject(), declaration.Elements[i].Offset));
            GL.EnableVertexAttribArray(attribute.Index);
        }
    }
    GLHelper.CheckError();

    uint[] cIndices = new uint[indices.Length];

    for (int i = 0; i < indices.Length; i++)
        cIndices[i] = (uint)indices[i];
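    // Note: the cIndices array built above is never used; DrawElements below
    // reads the original short[] indices through iHandle.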

    GL.DrawElements(GetMode(primitiveType), indices.Length - indexOffset, DrawElementsType.UnsignedShort, iHandle.AddrOfPinnedObject());
    GLHelper.CheckError();

    vHandle.Free();
    iHandle.Free();
}

I apologise for such large expanses of code; I did my best to reduce it to what seemed most pertinent. Obviously there's a good deal of related code in the texture classes, effects, blend states, etc. I felt these were unlikely to be involved, but I'm happy to include them if necessary. I'm stumped, and I'd really appreciate any insight someone more experienced may be able to offer. I suspect I've missed something fairly fundamental somewhere.

Addendum 1: I intended to include this originally, but it slipped my mind. I have used a number of simple shaders during testing, but the one I am using most commonly is a slightly modified version of an example from the OpenGL ES 2.0 Programming Guide. The vertex shader:

uniform mat4 MVPMatrix;
attribute vec4 Position;
attribute vec4 Colour;
attribute vec2 TextureCoordinate;
varying vec4 vColour;
varying vec2 vTextureCoordinate;

void main()
{
    vColour = Colour;
    vTextureCoordinate = TextureCoordinate;
    gl_Position = MVPMatrix * Position;
}

The fragment shader:

precision mediump float;
uniform sampler2D Texture;
varying vec4 vColour;
varying vec2 vTextureCoordinate;

void main()
{
    gl_FragColor = texture2D(Texture, vTextureCoordinate) * vColour;
}

Note: I have transcribed these by hand, so it's possible I've made small typos in doing so.

1 Answer


The problem exists in the GraphicsDevice.DrawUserIndexedPrimitivesImpl<T> method. The method, as presented above, takes no account of the index or vertex offsets, and it also determines the number of elements to draw incorrectly. Consequently, every draw call drew the entire index array (~12,000 indices), covering every sprite's vertices, with whichever texture was bound at the time; the final draw therefore repainted every sprite with the last texture. Blending had not been enabled, so the overdraw was not given away by transparent textures against the black background.

The problem was fixed by calculating the correct offset addresses into the vertex and index arrays and the correct number of indices to draw.

protected override void DrawUserIndexedPrimitivesImpl<T>(PrimitiveType primitiveType, T[] vertices, int vertexOffset, int numVertices, short[] indices, int indexOffset, int primitiveCount, VertexDeclaration declaration)
{
    if (vertices.Length == 0 || primitiveCount == 0)
        return;

    //source: https://github.com/mono/MonoGame/blob/develop/MonoGame.Framework/Graphics/GraphicsDevice.OpenGL.cs
    GCHandle vHandle = GCHandle.Alloc(vertices, GCHandleType.Pinned);
    GCHandle iHandle = GCHandle.Alloc(indices, GCHandleType.Pinned);
    IntPtr vertexPointer = IntPtr.Add(vHandle.AddrOfPinnedObject(), declaration.Stride * vertexOffset);
    IntPtr indexPointer = IntPtr.Add(iHandle.AddrOfPinnedObject(), sizeof(short) * indexOffset);

    ShaderVertexAttributeMap map = currentProgram.GetMap(declaration);

    for (int i = 0; i < map.Length; i++)
    {
        if (map[i].Index > -1)
        {
            GL.VertexAttribPointer(map[i].Index, map[i].Size, map[i].Type, map[i].Normalised, declaration.Stride, IntPtr.Add(vertexPointer, declaration.Elements[i].Offset));
            GL.EnableVertexAttribArray(map[i].Index);
        }
    }

    GLHelper.CheckError();

    GL.DrawElements(GLConverter.Convert(primitiveType), IndexCount(primitiveType, primitiveCount), DrawElementsType.UnsignedShort, indexPointer);

    GLHelper.CheckError();

    vHandle.Free();
    iHandle.Free();
}

private int IndexCount(PrimitiveType type, int count)
{
    switch (type)
    {
        case PrimitiveType.LineList: return 2 * count;
        case PrimitiveType.LineStrip: return count + 1;
        case PrimitiveType.TriangleList: return 3 * count;
        case PrimitiveType.TriangleStrip: return count + 2;
        default: throw new ArgumentException("Unrecognised primitive type.");
    }
}
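As a worked example in the sprite batch above: a batch of 100 sprites passes primitiveCount = 200 (two triangles per sprite), so IndexCount(PrimitiveType.TriangleList, 200) returns 600, exactly the six indices each sprite contributes, instead of the full ~12,000-index array; vertexPointer is likewise advanced so that index 0 refers to the first vertex of the batch.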

So far this has worked without incident. I will update this answer if any problems arise.