3
votes

I have been working through WebGL tutorials like webglfundamentals and have run into a stumbling point - I believe that I will need to use a texture that I create to pass information directly to the fragment shader, but I can't seem to index the texture properly.

The goal is to pass information about light sources (location and color) that will be factored into the fragment color. Ideally this information is dynamic in both value and length.

Reproduction

I've created a simplified version of the problem in this fiddle: WebGL - Data Texture Testing

Here's some of the code.

In a one-time setup we create a texture, fill it with data, and apply what seem to be the most fool-proof settings on that texture (no mips, no byte packing issues[?])

  // lookup uniforms
  var textureLocation = gl.getUniformLocation(program, "u_texture");

  // Create a texture.
  var texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);

  // fill texture with 1x3 pixels
  const level = 0;
  const internalFormat = gl.RGBA; // I've also tried to work with gl.LUMINANCE
  //   but it feels harder to debug
  const width = 1;
  const height = 3;
  const border = 0;
  const type = gl.UNSIGNED_BYTE;
  const data = new Uint8Array([
    // R,   G,   B,   A (unused)   // 'texel' index (?)
       64,   0,   0,   0,          // 0
        0, 128,   0,   0,          // 1
        0,   0, 255,   0,          // 2
  ]);
  const alignment = 1; // should be unnecessary for this texture, but
  gl.pixelStorei(gl.UNPACK_ALIGNMENT, alignment); //   I don't think this is hurting
  gl.texImage2D(gl.TEXTURE_2D, level, internalFormat, width, height, border,
    internalFormat, type, data);

  // set the filtering so we don't need mips and it's not filtered
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

In the draw sequence (which only happens once but conceivably could repeat) we reinforce that the program should use our texture

    // Tell the shader to use texture unit 0 for u_texture
    gl.activeTexture(gl.TEXTURE0);                  // added this and following line to be extra sure which texture is being used...
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.uniform1i(textureLocation, 0);

Finally, in the fragment shader, we are just trying to use one 'texel' as a means of conveying information, but I can't make heads or tails of how to reliably retrieve the values that I stored in the texture.

precision mediump float;

// The texture.
uniform sampler2D u_texture;

void main() {

    vec4 sample_00 = texture2D(u_texture, vec2(0, 0)); 
    // This sample generally appears to be correct. 
    // Changing the provided data for the B channel of texel
    //   index 0 seems to add blue as expected

    vec4 sample_01 = texture2D(u_texture, vec2(0, 1));
    vec4 sample_02 = texture2D(u_texture, vec2(0, 2));
    // These samples are how I expected this to work since 
    //   the width of the texture is set to 1
    // For some reason 01 and 02 both show the same color

    vec4 sample_10 = texture2D(u_texture, vec2(1, 0));
    vec4 sample_20 = texture2D(u_texture, vec2(2, 0));
    // These samples are just there for testing - I don't think
    //   that they should work

    // choose which sample to display
    vec4 sample = sample_00;
    gl_FragColor = vec4(sample.x, sample.y, sample.z, 1);
}

Question(s)

Is using a texture the best way to do this? I have heard of the ability to pass arrays of vectors as well, but textures seem to be more common.

How are you supposed to create the texture? (particularly, when I specify 'width' and 'height', should I be referring to the resulting texel dimensions or to the number of gl.UNSIGNED_BYTE elements that I will use to construct the texture? See the texImage2D documentation)

How are you supposed to index into the texture in a fragment shader when not using 'varying' types? (i.e. I just want the value of one or more particular texels - no interpolation [having little to nothing to do with vertices])

Other Resources

I've read as much as I can about this for the time being. Here's a non-exhaustive list:

Edit: Here's another resource I just found: Hassles with array access in WebGL, and a couple of workarounds. Makes me hopeful.

This is really bugging me.

Thanks in advance!

2
Why not pass the lights as an array of uniforms, e.g. uniform vec3 lights[4];? Also, looking up texture pixels is done via unit coords, not pixel coords, e.g. texture2D(tex, vec2(0, 2)) should be texture2D(tex, vec2(0, 0.5)). – Blindman67

2 Answers

1
votes

Use uniform arrays

The linked article is somewhat old. WebGL has support for uniform arrays via uniform[1234][if]v, which has good browser support.

It is not a good idea to use a texture to store floating point data when you can simply use a uniform array to hold dynamic data such as lights.

The example below, which uses WebGL1 (not WebGL2, which would be the better option), has two uniform arrays:

// define number of lights
#define LIGHTS 5
uniform vec3 lights[LIGHTS];
uniform vec3 lightCols[LIGHTS];

They are set via gl.uniform3fv. Note that the v stands for vector (a C-style array).
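
For example, a minimal sketch of the JavaScript side (assuming program is the linked program containing the declarations above and LIGHTS matches the #define):

const lightsLoc = gl.getUniformLocation(program, "lights");       // the location of element 0 addresses the whole array
const lightColsLoc = gl.getUniformLocation(program, "lightCols");

// 3 floats per light: x,y,z for positions, r,g,b for colours
const lightPositions = new Float32Array(LIGHTS * 3);
const lightColours = new Float32Array(LIGHTS * 3);

// ... fill the arrays, then upload each one in a single call ...
gl.uniform3fv(lightsLoc, lightPositions);
gl.uniform3fv(lightColsLoc, lightColours);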

This saves you from sending redundant data (the alpha byte) and lets you work with floating point precision rather than texture bytes (although, via an extension, you can also get floating point textures).

You can also send only the array items that change rather than all the data each time.
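
For instance, a sketch of a partial update (it relies on querying the location of a single array element by name, which WebGL allows; the values here are placeholders):

// only light 2 changed, so upload just that vec3
const light2Loc = gl.getUniformLocation(program, "lights[2]");
gl.uniform3fv(light2Loc, new Float32Array([0.5, 0.25, 1.0])); // new position for light 2 (example values)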

const LIGHTS = 15; // Number of lights. MUST be an integer >= 1
const EXPOSURE = 1; // Scale applied to the final pixel colour
const shaders = {
    get locations() { return ["A_vertex","U_lights","U_lightCols"] },  // Prefix A_ (attribute) or U_ (uniform); the JS name follows the underscore
    get vertex() { return `
attribute vec2 vertex;
varying vec2 map;
void main() {
	map = vertex * 0.5 + 0.5; // map unit 0-1 over quad
	gl_Position = vec4(vertex, 0, 1);
}`;
    },
    get fragment() { return `
precision mediump float;
#define LIGHTS ${LIGHTS}
#define EXPOSURE ${EXPOSURE.toFixed(1)}
uniform vec3 lights[LIGHTS];
uniform vec3 lightCols[LIGHTS];
varying vec2 map;
void main() {
    vec3 light = vec3(0);
    for (int i = 0; i < LIGHTS; i++) {
        float level = pow(1.0-length(lights[i] - vec3(map, 0)), 4.0);
        light += lightCols[i] * level ;
    }
    gl_FragColor = vec4(light * EXPOSURE, 1.0);
}`;
    }
};
const gl = canvas.getContext("webgl", {alpha: false, depth: false, premultipliedAlpha: false, preserveDrawingBuffer: true});
var shader, frame = 0; 
function resize() { 
    if (canvas.width !== innerWidth || canvas.height !== innerHeight) {
        canvas.width = innerWidth;
        canvas.height = innerHeight;
        shader.gl.viewport(0, 0, canvas.width,canvas.height);
    }
}
requestAnimationFrame(update);
function update(timer){ // Main update loop
    frame += 1;
    if (shader === undefined) {
        shader = createRender(gl);
    }
    resize();
    gl.useProgram(shader.program);
    orbit(shader, frame);
    shader.gl.drawArrays(gl.TRIANGLES, 0, 6);
    requestAnimationFrame(update);
}


function orbit(shader, frame) {
    var i = LIGHTS;
    frame /= 100;
    const l = shader.lightsBuf, lp = shader.lights, c = shader.lightColsBuf, cp = shader.lightCols;
    while (i--) {
        const idx = i * 3, r = 0.1 + i / LIGHTS;
        l[idx    ] = lp[idx    ] + Math.cos(frame) * r * Math.sin(frame/2);
        l[idx + 1] = lp[idx + 1] + Math.sin(frame) * r * Math.sin(frame/2);
        l[idx + 2] = lp[idx + 2] + Math.cos(frame/2) * r;
        c[idx    ] = cp[idx    ] + (Math.sin(frame/3) * 0.5) * Math.cos(frame);
        c[idx + 1] = cp[idx + 1] + (Math.cos(frame/3)* 0.5) * Math.cos(frame);
        c[idx + 2] = cp[idx + 2] + Math.sin(frame) * 0.5;
        frame *= 1.2;
    }
    // upload the full arrays (in WebGL1, uniform3fv takes just the location and the data)
    shader.gl.uniform3fv(shader.locations.lights, l);
    shader.gl.uniform3fv(shader.locations.lightCols, c);
}
    


function createShader(gl, vertSrc, fragSrc, locDesc) {
    var locations = {};
    const program = gl.createProgram();
    const vShader = gl.createShader(gl.VERTEX_SHADER);
    const fShader = gl.createShader(gl.FRAGMENT_SHADER);
    gl.shaderSource(vShader, vertSrc);
    gl.shaderSource(fShader, fragSrc);
    gl.compileShader(vShader);
    if (!gl.getShaderParameter(vShader, gl.COMPILE_STATUS)) { throw new Error(gl.getShaderInfoLog(vShader)) } 
    gl.compileShader(fShader);
    if (!gl.getShaderParameter(fShader, gl.COMPILE_STATUS)) { throw new Error(gl.getShaderInfoLog(fShader)) } 
    gl.attachShader(program, vShader);
    gl.attachShader(program, fShader);
    gl.linkProgram(program);
    if (!gl.getProgramParameter(program, gl.LINK_STATUS)) { throw new Error(gl.getProgramInfoLog(program)) } 
    gl.useProgram(program);
    for(const desc of locDesc) {
        const [type, name] = desc.split("_");
        locations[name] = gl[`get${type==="A" ? "Attrib" : "Uniform"}Location`](program, name);
    }
    return {program, locations, gl};
}

function createRender(gl) {
    const shader = createShader(gl, shaders.vertex, shaders.fragment, shaders.locations);
    gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([-1,-1, 1,-1, -1,1, -1,1, 1,-1, 1,1]), gl.STATIC_DRAW);
    gl.enableVertexAttribArray(shader.locations.vertex);
    gl.vertexAttribPointer(shader.locations.vertex, 2, gl.FLOAT, false, 0, 0);
    shader.lightsBuf = new Float32Array(LIGHTS * 3);
    shader.lightColsBuf = new Float32Array(LIGHTS * 3);
    const lights = shader.lights = new Float32Array(LIGHTS * 3);
    const cols = shader.lightCols = new Float32Array(LIGHTS * 3);
    var i = LIGHTS * 3;
    while (i--) { cols[i] = Math.random(); lights[i] = Math.random(); }
    return shader;
}
body {
  padding: 0px;
}

canvas {
  position: absolute;
  top: 0px;
  left: 0px;
}
<canvas id="canvas"></canvas>
4
votes

Addressing individual pixels in a texture in WebGL1 uses this formula

vec2 pixelCoord = vec2(x, y);
vec2 textureDimensions = vec2(textureWidth, textureHeight);
vec2 texcoord = (pixelCoord + 0.5) / textureDimensions;
vec4 pixelValue = texture2D(someSamplerUniform, texcoord);

That's because texture coordinates go from edge to edge of the texture, so texel centers sit at half-texel offsets. If you have a 2x1 texture:

1.0+-------+-------+
   |       |       |
   |   A   |   B   |
   |       |       |
0.0+-------+-------+
  0.0     0.5     1.0

The texture coordinate at the center of pixel A is (0.25, 0.5). The texture coordinate at the center of pixel B is (0.75, 0.5).

If you don't follow the formula above and just use pixelCoord / textureDimensions, then you're pointing in between pixels and math errors will get you one pixel or the other.

Of course if you're using textures for data you also probably want to set gl.NEAREST filtering.
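
Applied to the 1x3 texture in the question, a fragment shader could address each texel with constant coordinates (a sketch assuming the same u_texture uniform and the 1x3 dimensions):

precision mediump float;

uniform sampler2D u_texture;   // 1 texel wide, 3 texels tall

void main() {
    float x = (0.0 + 0.5) / 1.0;   // center of the only column
    vec4 texel0 = texture2D(u_texture, vec2(x, (0.0 + 0.5) / 3.0));
    vec4 texel1 = texture2D(u_texture, vec2(x, (1.0 + 0.5) / 3.0));
    vec4 texel2 = texture2D(u_texture, vec2(x, (2.0 + 0.5) / 3.0));

    gl_FragColor = vec4(texel0.r, texel1.g, texel2.b, 1.0);
}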

In WebGL2 you can just use texelFetch

ivec2 pixelCoord = ivec2(x, y);
int mipLevel = 0;
vec4 pixelValue = texelFetch(someSamplerUniform, pixelCoord, mipLevel);

A working example of using textures for data is here

Is using a texture the best way to do this? I have heard of the ability to pass arrays of vectors as well, but textures seem to be more common.

To do what? It wasn't clear what you're trying to do. Is every pixel going to have a different light source?

How are you supposed to create the texture? (particularly, when I specify 'width' and 'height', should I be referring to the resulting texel dimensions or to the number of gl.UNSIGNED_BYTE elements that I will use to construct the texture? See the texImage2D documentation)

Do whatever is easiest or required. For example, if you have 4 pieces of data per thing you want to pull out, I might put each piece of data on a separate row in the texture. I can then do

vec4 datum1 = texture2D(dataTexture, vec2(indexTexCoordX, rowTexCoordY0));
vec4 datum2 = texture2D(dataTexture, vec2(indexTexCoordX, rowTexCoordY1));
vec4 datum3 = texture2D(dataTexture, vec2(indexTexCoordX, rowTexCoordY2));
vec4 datum4 = texture2D(dataTexture, vec2(indexTexCoordX, rowTexCoordY3));

Where indexTexCoordX and rowTexCoordY0-3 are computed from the formula above. rowTexCoordY0-3 might even be constants.
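
A sketch of how those values could be computed (the names dataIndex and dataTextureWidth are hypothetical; pass them in however suits your setup):

uniform float dataIndex;          // which item to look up
uniform float dataTextureWidth;   // width of the data texture in texels

void main() {
    float indexTexCoordX = (dataIndex + 0.5) / dataTextureWidth;
    // with 4 rows the row coordinates never change
    float rowTexCoordY0 = (0.0 + 0.5) / 4.0;   // 0.125
    float rowTexCoordY1 = (1.0 + 0.5) / 4.0;   // 0.375
    float rowTexCoordY2 = (2.0 + 0.5) / 4.0;   // 0.625
    float rowTexCoordY3 = (3.0 + 0.5) / 4.0;   // 0.875
    // ... texture2D lookups as above ...
}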

Textures have a limit on their dimensions though, so if you have more data than will fit in one dimension, you'll have to pack the data tighter and do more math to pull it out.
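
One way to do that extra math (a sketch; indexToTexCoord and dataDimensions are hypothetical names, with the texture size supplied as a uniform):

uniform vec2 dataDimensions;   // (width, height) of the data texture in texels

// convert a 1D element index into a texture coordinate
vec2 indexToTexCoord(float index) {
    float x = mod(index, dataDimensions.x);
    float y = floor(index / dataDimensions.x);
    return (vec2(x, y) + 0.5) / dataDimensions;
}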

Be aware that textures have caches, so ideally you want the data you pull out to be near the data you previously pulled out. If you jump across the texture for the next value every time, your performance will drop (though of course it may still be faster than an alternative solution, depending on what you're doing).

How are you supposed to index into the texture in a fragment shader when not using 'varying' types? (i.e. I just want the value of one or more particular texels - no interpolation [having little to nothing to do with vertices])

The only changing inputs to a fragment shader are varyings, gl_FragCoord (the coordinate of the pixel being written to), and gl_PointCoord, which is only available when drawing POINTS. So you have to use one of those; otherwise all other values are constant for all pixels.