I'm trying to read back a texture that was loaded with floating-point values, using my generic image-reading code, which reads it out as (8-bit) integer values.
I was expecting the values to be clamped to 0..1, which would make the result look pretty horrible but still recognizable. Instead, it comes back completely black, as if every pixel were zero.
The texture is GL_R32F format and I'm trying to read it as GL_UNSIGNED_BYTE with GL_RGBA as output.
Should this work? If not, what kind of thing should I be looking for?