3 votes

Suppose you have a simple Tileset for a game, like this:

[image: tileset]

Now, when dealing with a simple GUI framework such as .NET, it would be rather easy to load that image, select portions of it, and then draw it tile-by-tile. However, when using OpenGL, this process seems a little more... erm, unique. I can easily load that image into OpenGL and bind it to some geometry; however, when it comes to "selecting" certain tiles (without unbinding said texture), it seems to require a different kind of math than the traditional x, y, width, height approach.

If I wanted to select the tile at 64,128 (coordinates in pixel space) as the current tile, I would rather use texture coordinates that mirror that idea, instead of these weird fractions I have seen people suggesting on other websites.

It seems that OpenGL does not use pixel space at all when mapping textures, or perhaps I am misunderstanding some basic concepts here; I am not sure.

Here is a simple way to render a 32 by 32 tile at some arbitrary location (it will remain at 0,0 for this example):

            int spriteX = 0;
            int spriteY = 0;
            int spriteWidth = 32;
            int spriteHeight = 32;
            GL.BindTexture( TextureTarget.Texture2D , texture );
            GL.Begin( PrimitiveType.Quads );
            {
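                // Texture coordinates span 0..1 on both axes, so the full texture is mapped onto the quad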
                GL.TexCoord2( 0 , 1 );
                GL.Vertex2( spriteX , spriteY );
                GL.TexCoord2( 1 , 1 );
                GL.Vertex2( spriteX + spriteWidth , spriteY );
                GL.TexCoord2( 1 , 0 );
                GL.Vertex2( spriteX + spriteWidth , spriteY + spriteHeight );
                GL.TexCoord2( 0 , 0 );
                GL.Vertex2( spriteX , spriteY + spriteHeight );
            }
            GL.End();

Before anyone complains about immediate mode: I am fully aware that it is deprecated; I do not plan on using it for any kind of finished product.

Instead of mapping the entire texture onto the drawn quad (which the above code does), how would I tell it to map only a region of the image, say X 64, Y 128, Width 32, Height 32?


1 Answer

6 votes

The most direct approach is to use what you call "these weird fractions". They're not really that weird. They're just... fractions.

Say your whole texture atlas is 1024x1024 and you want the region with your specified dimensions (X 64, Y 128, Width 32, Height 32). The texture coordinates are:

left:   X / 1024            = 64.0f / 1024.0f
right:  (X + Width) / 1024  = 96.0f / 1024.0f
top:    Y / 1024            = 128.0f / 1024.0f
bottom: (Y + Height) / 1024 = 160.0f / 1024.0f
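
Put together, a minimal immediate-mode sketch in plain C, assuming a 1024x1024 atlas and a quad drawn at the origin (the u0/v0 names are just illustrative, and depending on your projection and image loading you may need to flip the v coordinates the way your original snippet does):

float u0 = 64.0f / 1024.0f;   /* left   */
float u1 = 96.0f / 1024.0f;   /* right  */
float v0 = 128.0f / 1024.0f;  /* top    */
float v1 = 160.0f / 1024.0f;  /* bottom */

glBindTexture(GL_TEXTURE_2D, texture);
glBegin(GL_QUADS);
    glTexCoord2f(u0, v0); glVertex2f( 0.0f,  0.0f);
    glTexCoord2f(u1, v0); glVertex2f(32.0f,  0.0f);
    glTexCoord2f(u1, v1); glVertex2f(32.0f, 32.0f);
    glTexCoord2f(u0, v1); glVertex2f( 0.0f, 32.0f);
glEnd();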

An alternative is to specify a transformation for the texture coordinates, which in this case would be a scaling transformation. With the fixed pipeline, it looks like this:

glMatrixMode(GL_TEXTURE);
glScalef(1.0f / 1024.0f, 1.0f / 1024.0f, 1.0f);
glMatrixMode(GL_MODELVIEW);

Then you can specify your texture coordinates using the pixel positions within the texture atlas.
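
For example, with that texture matrix in place, the 32x32 tile at (64, 128) could be drawn like this; a sketch, where the added glLoadIdentity simply resets any previous texture matrix:

glMatrixMode(GL_TEXTURE);
glLoadIdentity();
glScalef(1.0f / 1024.0f, 1.0f / 1024.0f, 1.0f);
glMatrixMode(GL_MODELVIEW);

glBindTexture(GL_TEXTURE_2D, texture);
glBegin(GL_QUADS);
    /* Pixel positions within the atlas; the texture matrix scales them to 0..1 */
    glTexCoord2f(64.0f, 128.0f); glVertex2f( 0.0f,  0.0f);
    glTexCoord2f(96.0f, 128.0f); glVertex2f(32.0f,  0.0f);
    glTexCoord2f(96.0f, 160.0f); glVertex2f(32.0f, 32.0f);
    glTexCoord2f(64.0f, 160.0f); glVertex2f( 0.0f, 32.0f);
glEnd();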

With the programmable pipeline, the above is obsolete. But you can easily apply the same kind of scaling by passing a uniform value to the shader, which you then multiply with the texture coordinates.

In the vertex shader, it could look like this:

uniform vec2 TexCoordScale;
in vec2 TexCoord;
out vec2 FragTexCoord;
...
    FragTexCoord = TexCoordScale * TexCoord;

Then in the fragment shader, you have the matching FragTexCoord in variable, and use it for your texture sampling operation.
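
A minimal matching fragment shader might look like this (the sampler uniform name Tex is my own; use whatever your code actually binds):

uniform sampler2D Tex;
in vec2 FragTexCoord;
out vec4 FragColor;

void main()
{
    // Sample the atlas with the already-scaled coordinates
    FragColor = texture(Tex, FragTexCoord);
}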

In the client code, you set the uniform:

GLint texCoordScaleLoc = glGetUniformLocation(program, "TexCoordScale");
glUniform2f(texCoordScaleLoc, 1.0f / 1024.0f, 1.0f / 1024.0f);

and set up the vertex attribute for the texture coordinates just like you normally would, except that you can now use pixel coordinates for them instead of "weird fractions".
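
For example, the texture coordinates for the 32x32 tile at (64, 128) can go into the vertex buffer as plain pixel values. A sketch, where texCoordBuffer and the TexCoord attribute setup are illustrative:

/* Pixel-space texture coordinates for the four corners of the tile */
GLfloat texCoords[] = {
    64.0f, 128.0f,   /* top-left     */
    96.0f, 128.0f,   /* top-right    */
    96.0f, 160.0f,   /* bottom-right */
    64.0f, 160.0f    /* bottom-left  */
};

glBindBuffer(GL_ARRAY_BUFFER, texCoordBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(texCoords), texCoords, GL_STATIC_DRAW);

GLint texCoordLoc = glGetAttribLocation(program, "TexCoord");
glEnableVertexAttribArray(texCoordLoc);
glVertexAttribPointer(texCoordLoc, 2, GL_FLOAT, GL_FALSE, 0, 0);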