
OpenGL ES 2.0 supports 3D textures only through the OES_texture_3D extension, which many devices do not expose. So what I'm trying to do is emulate 3D textures with 2D textures. First, I flattened the 3D texture data into a 2D texture atlas: for example, instead of a 128x128x4 3D texture I have a 2D atlas containing its 4 slices of 128x128 each. The fragment shader looks something like this:

precision mediump float;

uniform sampler2D s_texture;   // the 2D atlas holding the slices
uniform vec2 textureSize2D;    // size of the 2D atlas, in pixels
uniform vec3 textureSize3D;    // size of the emulated 3D texture, in pixels
varying vec3 texCoords;

vec2 To2DCoords(vec3 coords)
{
    float u = coords.x + textureSize3D.x * (coords.z - textureSize2D.x * floor(coords.z / textureSize2D.x));
    float v = coords.y + textureSize3D.y * floor(coords.x / textureSize2D.x);
    return vec2(u, v);
}

void main()
{
    gl_FragColor = texture2D(s_texture, To2DCoords(texCoords));
}

The To2DCoords method is inspired by the flat 3D texture addressing described in GPU Gems 3, chapter 29: https://developer.nvidia.com/gpugems/GPUGems3/gpugems3_ch29.html.
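For reference, the slice-tiling idea behind that chapter's technique can be sketched on the CPU in Python. This is a hypothetical sketch, not code from the question: the helper name `to_2d_coords`, the `cols` tiles-per-row parameter, and the left-to-right, top-to-bottom slice layout are all assumptions.

```python
import numpy as np

# Assumed layout: the D slices of a WxHxD volume are tiled
# left-to-right, top-to-bottom into a 2D atlas, `cols` slices per row.
# All coordinates here are in pixels, not normalized [0,1] coordinates.

def to_2d_coords(x, y, z, w, h, cols):
    """Map 3D pixel coords (x, y, z) to 2D atlas pixel coords (u, v)."""
    u = x + (z % cols) * w    # horizontal offset of slice z's tile
    v = y + (z // cols) * h   # vertical offset of slice z's tile
    return u, v

# Build a tiny 4x4x4 volume and its 2x2-tile atlas, then check that
# every voxel lands on the matching atlas texel.
w = h = d = 4
cols = 2
volume = np.arange(w * h * d).reshape(d, h, w)  # volume[z, y, x]

atlas = np.zeros((h * (d // cols), w * cols), dtype=volume.dtype)
for z in range(d):
    u0, v0 = to_2d_coords(0, 0, z, w, h, cols)
    atlas[v0:v0 + h, u0:u0 + w] = volume[z]

for z in range(d):
    for y in range(h):
        for x in range(w):
            u, v = to_2d_coords(x, y, z, w, h, cols)
            assert atlas[v, u] == volume[z, y, x]
```

Note that this mapping only mixes the slice index `z` into the offsets; the shader above instead divides by the atlas size inside the tile-selection terms, which is one place to compare against the reference.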

The problem is that at render time everything looks messed up compared with the real 3D texture. What am I doing wrong?


1 Answer


According to your code, the input and output of To2DCoords() are in pixel coordinates (e.g. 0–255 for a 256x256 texture), not in normalized texture coordinates (0.0–1.0), which is what texture2D() and your varying texCoords use.

Your lookup should therefore convert into pixel space and back:

gl_FragColor = texture2D(s_texture, To2DCoords(texCoords * textureSize3D) / textureSize2D);
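As a sanity check, the full pipeline this answer describes (normalized 3D coords to pixels, pixels to atlas pixels, atlas pixels back to normalized 2D coords) can be sketched in Python. The helper names and the slice layout (tiles left-to-right, `cols` per row) are illustrative assumptions, not part of the question's code.

```python
# Pixel-space mapping of a voxel into the 2D atlas (assumed tiling:
# slices left-to-right, top-to-bottom, `cols` slices per row).
def to_2d_coords(x, y, z, w, h, cols):
    return x + (z % cols) * w, y + (z // cols) * h

def sample_coords(tc, tex3d_size, tex2d_size, cols):
    """tc: normalized 3D coords in [0,1]; returns normalized 2D coords."""
    w3, h3, d3 = tex3d_size
    w2, h2 = tex2d_size
    # normalized -> pixel coordinates in the emulated 3D texture
    x, y, z = tc[0] * w3, tc[1] * h3, tc[2] * d3
    # pixel coordinates in the 2D atlas (slice index must be integral)
    u, v = to_2d_coords(x, y, int(z), w3, h3, cols)
    # pixel -> normalized coordinates in the 2D atlas
    return u / w2, v / h2

# Example: a 128x128x4 texture flattened into a 512x128 atlas
# (all 4 slices in one row). The center of the volume in z lands
# in slice 2, at the center of that slice's tile.
print(sample_coords((0.5, 0.5, 0.5), (128, 128, 4), (512, 128), 4))
```

The division by the 2D atlas size happens once, at the very end, which is exactly what the corrected shader line above does.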