3 votes

I'm trying to give a fragment shader access to a 3-dimensional array of scalar data from a Python program using PyOpenGL.

In the fragment shader, I declare a 3D sampler uniform

uniform sampler3D vol;

and in the Python program I have the following code to set up a scalar 3D texture:

vol = numpy.random.rand(3, 3, 3).astype(numpy.float32)
texture = glGenTextures(1)
glUniform1i(glGetUniformLocation(program, "vol"), 0)
glActiveTexture(GL_TEXTURE0 + 0)
glBindTexture(GL_TEXTURE_3D, texture)
glTexImage3D(GL_TEXTURE_3D, 0, GL_RED, 3, 3, 3, 0, GL_RED, GL_FLOAT, vol)
glEnable(GL_TEXTURE_3D)

However, no matter where I sample the texture from, e.g.

color = texture(vol, vec3(0, 0, 0));

it appears that I always obtain black (0, 0, 0).

What am I doing wrong?

I know that the basic setup of my fragment shader works, i.e. if I write color = vec3(1, 0, 0) I get red pixels.
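For reference, a minimal sketch of how such a fragment shader could be compiled with PyOpenGL's shader helpers (the GLSL version and the out variable declaration are illustrative assumptions, not my exact shader; only the sampler declaration and the sampling call match the snippets above):

from OpenGL.GL import GL_FRAGMENT_SHADER
from OpenGL.GL.shaders import compileShader

fragment_src = """
#version 330 core
uniform sampler3D vol;
out vec4 frag_color;
void main() {
    // the scalar data lives in the red channel of the 3D texture
    frag_color = vec4(texture(vol, vec3(0.5)).rrr, 1.0);
}
"""
fragment_shader = compileShader(fragment_src, GL_FRAGMENT_SHADER)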

I also know that there are no OpenGL errors, because I'm running the program with the -glerror option processed by glutInit(), which causes OpenGL errors to be translated into Python exceptions.
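For what it's worth, the same can be cross-checked manually with glGetError; a small helper along these lines (the helper name is just for illustration):

from OpenGL.GL import glGetError, GL_NO_ERROR

def check_gl_errors(where=""):
    # Drain the GL error queue and report anything that is pending.
    err = glGetError()
    while err != GL_NO_ERROR:
        print("GL error 0x%04X %s" % (err, where))
        err = glGetError()

# e.g. check_gl_errors("after glTexImage3D")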


2 Answers

2 votes

That is because your GL_RED texture format is clamped to the range <0,1>!

To remedy this you need to use a non-clamped texture format or disable clamping... Here are examples that are working on my GL implementations:

Here are the formats extracted from both:

glTexImage3D(GL_TEXTURE_3D, 0, GL_R16F, xs, ys, zs, 0, GL_RED, GL_FLOAT, dat);

glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA8, size, size, size, 0, GL_RGBA, GL_UNSIGNED_BYTE, pdata);

For scalar data I would use the first option. There are more formats that are not clamped; just try and see...
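Translated to the PyOpenGL call from the question, the first option would look something like this (GL_R32F is just one choice of unclamped format; GL_R16F should work the same way):

from OpenGL.GL import glTexImage3D, GL_TEXTURE_3D, GL_R32F, GL_RED, GL_FLOAT

# upload the 3x3x3 float volume (vol from the question) with a
# floating-point internal format instead of the unsized GL_RED
glTexImage3D(GL_TEXTURE_3D, 0, GL_R32F, 3, 3, 3, 0, GL_RED, GL_FLOAT, vol)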

I have never used the clamping-disabling feature myself, but I saw this code somewhere while researching similar issues (not sure if it works):

glClampColorARB(GL_CLAMP_VERTEX_COLOR_ARB, GL_FALSE);
glClampColorARB(GL_CLAMP_READ_COLOR_ARB, GL_FALSE);
glClampColorARB(GL_CLAMP_FRAGMENT_COLOR_ARB, GL_FALSE);

With that, theoretically, you could use any texture format...


Also, I do not see any texture parameters being set. I would expect something like this:

glBindTexture(GL_TEXTURE_3D, txrvol);
glPixelStorei(GL_UNPACK_ALIGNMENT, 4);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

to avoid interpolation messing up your data for non-exact texture coordinates...
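In the PyOpenGL code from the question this would translate to something like the following sketch (the glTexEnvf line belongs to the fixed-function pipeline and can be dropped when a shader is used):

from OpenGL.GL import (
    glBindTexture, glPixelStorei, glTexParameteri,
    GL_TEXTURE_3D, GL_UNPACK_ALIGNMENT, GL_CLAMP_TO_EDGE, GL_NEAREST,
    GL_TEXTURE_WRAP_S, GL_TEXTURE_WRAP_T, GL_TEXTURE_WRAP_R,
    GL_TEXTURE_MAG_FILTER, GL_TEXTURE_MIN_FILTER)

glBindTexture(GL_TEXTURE_3D, texture)
glPixelStorei(GL_UNPACK_ALIGNMENT, 4)  # float32 rows are always 4-byte aligned
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE)
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE)
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE)
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_NEAREST)
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_NEAREST)

Note that setting GL_TEXTURE_MIN_FILTER to a non-mipmap filter such as GL_NEAREST also makes the texture complete without a mipmap chain, which touches on the other answer below.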

1 vote

I figured out the problem: apparently GL_TEXTURE_3D is mipmapped by default, I only provided level 0, and (for reasons I'm not clear about) another level is selected. The problem is solved by glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAX_LEVEL, 0).
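In the PyOpenGL setup from the question, a minimal sketch of the fix, placed after glBindTexture (setting GL_TEXTURE_MIN_FILTER to a non-mipmap filter such as GL_NEAREST should work just as well, since the default minification filter expects a full mipmap chain):

from OpenGL.GL import glTexParameteri, GL_TEXTURE_3D, GL_TEXTURE_MAX_LEVEL

# only mipmap level 0 was uploaded, so restrict sampling to that level
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAX_LEVEL, 0)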