2 votes

Short question: How can I pass a list of textures to shaders and access the nth texture within a fragment shader (where n is a value passed as a varying from the vertex shader)?

Longer question: I'm working on a Three.js scene that represents multiple images. Each image uses one of several textures, and each texture is an atlas containing a number of thumbnails. I'm working on implementing a custom ShaderMaterial to optimize performance, but am confused about how to use multiple textures in the shaders.

My goal is to pass a list of textures and a number that represents the number of vertices per texture so that I can identify the texture that should be used for each image's vertices/pixels. I thought I could accomplish this by passing the following data:

// Create a texture loader so we can load our image file
var loader = new THREE.TextureLoader();

// specify the url to the texture
var catUrl = 'https://s3.amazonaws.com/duhaime/blog/tsne-webgl/assets/cat.jpg';
var dogUrl = 'https://s3.amazonaws.com/duhaime/blog/tsne-webgl/assets/dog.jpg';

var material = new THREE.ShaderMaterial({  
  uniforms: {

    verticesPerTexture: { value: 4.0 }, // count of vertices per texture (uniform values must be wrapped in { value: ... })

    textures: {
      type: 'tv', // type for texture array
      value: [loader.load(catUrl), loader.load(dogUrl)],
    }
  },
  vertexShader: document.getElementById('vertex-shader').textContent,
  fragmentShader: document.getElementById('fragment-shader').textContent
});

However, if I do this, the vertex shader can't seem to use the uniforms to tell the fragment shader which texture it should use, as vertex shaders evidently can't pass sampler2D values to the fragment shader as varyings. How can I pass a list of textures to the shaders?

Full code (which doesn't successfully pass a list of textures):

/**
* Generate a scene object with a background color
**/

function getScene() {
  var scene = new THREE.Scene();
  scene.background = new THREE.Color(0xffffff);
  return scene;
}

/**
* Generate the camera to be used in the scene. Camera args:
*   [0] field of view: identifies the portion of the scene
*     visible at any time (in degrees)
*   [1] aspect ratio: identifies the aspect ratio of the
*     scene in width/height
*   [2] near clipping plane: objects closer than the near
*     clipping plane are culled from the scene
*   [3] far clipping plane: objects farther than the far
*     clipping plane are culled from the scene
**/

function getCamera() {
  var aspectRatio = window.innerWidth / window.innerHeight;
  var camera = new THREE.PerspectiveCamera(75, aspectRatio, 0.1, 1000);
  camera.position.set(0, 1, 10);
  return camera;
}

/**
* Generate the renderer to be used in the scene
**/

function getRenderer() {
  // Create the canvas with a renderer
  var renderer = new THREE.WebGLRenderer({antialias: true});
  // Add support for retina displays
  renderer.setPixelRatio(window.devicePixelRatio);
  // Specify the size of the canvas
  renderer.setSize(window.innerWidth, window.innerHeight);
  // Add the canvas to the DOM
  document.body.appendChild(renderer.domElement);
  return renderer;
}

/**
* Generate the controls to be used in the scene
* @param {obj} camera: the three.js camera for the scene
* @param {obj} renderer: the three.js renderer for the scene
**/

function getControls(camera, renderer) {
  var controls = new THREE.TrackballControls(camera, renderer.domElement);
  controls.zoomSpeed = 0.4;
  controls.panSpeed = 0.4;
  return controls;
}

/**
* Load image
**/

function loadImage() {

  var geometry = new THREE.BufferGeometry();

  /*
  Now we need to push some vertices into that geometry to identify the coordinates the geometry should cover
  */

  // Identify the image size
  var imageSize = {width: 10, height: 7.5};

  // Identify the x, y, z coords where the image should be placed
  var coords = {x: -5, y: -3.75, z: 0};

  // Add one vertex for each corner of the image, using the 
  // following order: lower left, lower right, upper right, upper left
  var vertices = new Float32Array([
    coords.x, coords.y, coords.z, // bottom left
    coords.x+imageSize.width, coords.y, coords.z, // bottom right
    coords.x+imageSize.width, coords.y+imageSize.height, coords.z, // upper right
    coords.x, coords.y+imageSize.height, coords.z, // upper left
  ])

  // set the uvs for this box; these identify the following corners:
  // lower-left, lower-right, upper-right, upper-left
  var uvs = new Float32Array([
    0.0, 0.0,
    1.0, 0.0,
    1.0, 1.0,
    0.0, 1.0,
  ])

  // store the texture index of each object to be rendered
  var textureIndices = new Float32Array([0.0, 0.0, 0.0, 0.0]);

  // indices = sequence of index positions in `vertices` to use as vertices
  // we make two triangles but only use 4 distinct vertices in the object
  // the second argument to THREE.BufferAttribute is the number of elements
  // in the first argument per vertex
  geometry.setIndex([0,1,2, 2,3,0])
  geometry.addAttribute('position', new THREE.BufferAttribute(vertices, 3));
  geometry.addAttribute('uv', new THREE.BufferAttribute(uvs, 2));

  // Create a texture loader so we can load our image file
  var loader = new THREE.TextureLoader();

  // specify the url to the texture
  var catUrl = 'https://s3.amazonaws.com/duhaime/blog/tsne-webgl/assets/cat.jpg';
  var dogUrl = 'https://s3.amazonaws.com/duhaime/blog/tsne-webgl/assets/dog.jpg';

  // specify custom uniforms and attributes for shaders
  // Uniform types: https://github.com/mrdoob/three.js/wiki/Uniforms-types
  var material = new THREE.ShaderMaterial({  
    uniforms: {

      verticesPerTexture: { value: 4.0 }, // store the count of vertices per texture (uniform values must be wrapped in { value: ... })

      cat_texture: {
        type: 't',
        value: loader.load(catUrl),
      },

      dog_texture: {
        type: 't',
        value: loader.load(dogUrl),
      },

      textures: {
        type: 'tv', // type for texture array
        value: [loader.load(catUrl), loader.load(dogUrl)],
      }
    },
    vertexShader: document.getElementById('vertex-shader').textContent,
    fragmentShader: document.getElementById('fragment-shader').textContent
  });

  // Combine our image geometry and material into a mesh
  var mesh = new THREE.Mesh(geometry, material);

  // Set the position of the image mesh in the x,y,z dimensions
  mesh.position.set(0,0,0)

  // Add the image to the scene
  scene.add(mesh);
}

/**
* Render!
**/

function render() {
  requestAnimationFrame(render);
  renderer.render(scene, camera);
  controls.update();
};

var scene = getScene();
var camera = getCamera();
var renderer = getRenderer();
var controls = getControls(camera, renderer);
loadImage();

render();
html, body { width: 100%; height: 100%; background: #000; }
body { margin: 0; overflow: hidden; }
canvas { width: 100%; height: 100%; }
<script src='https://cdnjs.cloudflare.com/ajax/libs/three.js/92/three.min.js'></script>
<script src='https://threejs.org/examples/js/controls/TrackballControls.js'></script>

<script type='x-shader/x-vertex' id='vertex-shader'>
  /**
  * The vertex shader's main() function must define `gl_Position`,
  * which describes the position of each vertex in the space.
  *
  * To do so, we can use the following variables defined by Three.js:        
  *   
  *   uniform mat4 modelViewMatrix - combines:
  *     model matrix: maps a point's local coordinate space into world space
  *     view matrix: maps world space into camera space
  *
  *   uniform mat4 projectionMatrix - maps camera space into screen space
  *
  *   attribute vec3 position - sets the position of each vertex
  *
  *   attribute vec2 uv - determines the relationship between vertices and textures
  *
  * `uniforms` are constant across all vertices
  *
  * `attributes` can vary from vertex to vertex and are defined as arrays
  *   with length equal to the number of vertices. Each index in the array
  *   is an attribute for the corresponding vertex
  *
  * `varyings` are values passed from the vertex to the fragment shader
  *
  * Declaring attributes that are not actually passed to the vertex shader will not prevent the shader from compiling
  **/

  // declare uniform vals
  uniform float verticesPerTexture; // store the vertices per texture

  // declare variables to pass to fragment shaders
  varying vec2 vUv; // pass the uv coordinates of each vertex to the frag shader
  varying float textureIndex; // pass the texture idx

  // initialize counters
  float vertexIdx = 0.0; // stores the index position of the current vertex
  float textureIdx = 1.0; // store the index position of the current texture

  void main() {
    // keep track of which texture each vertex belongs to
    vertexIdx = vertexIdx + 1.0;
    if (vertexIdx == verticesPerTexture) {
      textureIdx = textureIdx + 1.0;
      vertexIdx = 0.0;
    }

    vUv = uv;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  }
</script>

<script type='x-shader/x-fragment' id='fragment-shader'>
  /**
  * The fragment shader's main() function must define `gl_FragColor`,
  * which describes the pixel color of each pixel on the screen.
  *
  * To do so, we can use uniforms passed into the shader and varyings
  * passed from the vertex shader
  *
  * Attempting to read a varying not generated by the vertex shader will
  * throw a warning but won't prevent shader compiling
  *
  * Each attribute must contain n_vertices * n_components, where n_components
  * is the length of the given datatype (e.g. vec2 n_components = 2;
  * float n_components = 1)
  **/

  precision highp float; // set float precision (optional)

  varying vec2 vUv; // identify the uv values as a varying attribute
  varying float textureIndex; // identify the texture indices as a varying attribute

  uniform sampler2D cat_texture; // identify the texture as a uniform argument
  uniform sampler2D dog_texture; // identify the texture as a uniform argument
  //uniform sampler2D textures;

  // TODO pluck out textures[textureIndex];
  //uniform sampler2D textures[int(textureIndex)];

  void main() {
    int textureIdx = int(textureIndex);

    // floating point arithmetic prevents strict equality checking
    if ( (textureIndex - 1.0) < 0.1 ) {
      gl_FragColor = texture2D(cat_texture, vUv);
    } else {
      gl_FragColor = texture2D(dog_texture, vUv);
    }
  }
</script>

  
I feel like the data structure I'm after is the sampler1DArray (khronos.org/opengl/wiki/Sampler_(GLSL)), but I can't find any examples of this data structure in Three.js – duhaime

The chief difficulty here revolves around initializing the array of sampler2D values in the fragment shader: how does one declare the type of an array of sampler2D values in GLSL? – duhaime

This is very close; if only line 85 were allowed: bl.ocks.org/duhaime/6c7cb04962d7b242757c75f5fbcc9607 – duhaime

2 Answers

1 vote

Having slept on it, here's another method you can try, more akin to how you'd do this with built-in materials:

function createMaterial ( texture ) {
    return new THREE.ShaderMaterial({
        uniforms: {
            texture: { value: texture }
        }
        // plus the same vertexShader / fragmentShader sources as in the question
    })
}

// catTexture and dogTexture loaded with THREE.TextureLoader as in the question
var mat1 = createMaterial( dogTexture );
var mat2 = createMaterial( catTexture );

// with THREE.Geometry, each face stores an index into the mesh's material array
geometry.faces[ 0 ].materialIndex = 0;
geometry.faces[ 1 ].materialIndex = 0;
geometry.faces[ 2 ].materialIndex = 1;
geometry.faces[ 3 ].materialIndex = 1;

var mesh = new THREE.Mesh( geometry, [ mat1, mat2 ] );
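
Note that geometry.faces and materialIndex only exist on the old THREE.Geometry class, while the question builds a THREE.BufferGeometry. Below is a minimal sketch of the equivalent using BufferGeometry groups; it assumes the loader, catUrl, dogUrl, and scene variables from the question and swaps in MeshBasicMaterial purely to keep the sketch short:

// a minimal sketch: two side-by-side quads, one texture per quad
var geometry = new THREE.BufferGeometry();

// 8 vertices (4 per quad) and matching uvs
var vertices = new Float32Array([
  -10, -3.75, 0,    0, -3.75, 0,    0, 3.75, 0,   -10, 3.75, 0,  // quad 0
    0, -3.75, 0,   10, -3.75, 0,   10, 3.75, 0,     0, 3.75, 0,  // quad 1
]);
var uvs = new Float32Array([
  0, 0,  1, 0,  1, 1,  0, 1,  // quad 0
  0, 0,  1, 0,  1, 1,  0, 1,  // quad 1
]);

geometry.addAttribute('position', new THREE.BufferAttribute(vertices, 3));
geometry.addAttribute('uv', new THREE.BufferAttribute(uvs, 2));
geometry.setIndex([0,1,2, 2,3,0,  4,5,6, 6,7,4]);

// assign ranges of the index buffer to material slots: (start, count, materialIndex)
geometry.addGroup(0, 6, 0); // first quad  -> materials[0]
geometry.addGroup(6, 6, 1); // second quad -> materials[1]

var materials = [
  new THREE.MeshBasicMaterial({ map: loader.load(catUrl) }),
  new THREE.MeshBasicMaterial({ map: loader.load(dogUrl) }),
];

var mesh = new THREE.Mesh(geometry, materials);
scene.add(mesh);

The same grouping works with an array of ShaderMaterials: each group is drawn in its own call with the material at its materialIndex, so no texture index needs to reach the fragment shader at all.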
0 votes

You’ve written the vertex shader as if main() were a loop that iterates through all the vertices, updating vertexIdx and textureIdx as it goes, but that’s not how shaders work. Shaders run in parallel, processing every vertex independently (and conceptually at the same time), so nothing the shader computes for one vertex can be shared with another vertex.

Use an attribute on the geometry instead:

geometry.addAttribute( 'texIndex', new THREE.BufferAttribute( new Float32Array([ 0, 0, 0, 0, 1, 1, 1, 1 ]), 1 ) );

I’m getting a little out of my depth here but I think you then pass it through the vertex shader to a varying:

attribute float texIndex; // int attributes/varyings aren't allowed in WebGL1 GLSL, so use float
varying float vTexIndex;
void main () {
  vTexIndex = texIndex;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}

Finally, in the fragment shader:

varying float vTexIndex; // must be a float: int varyings aren't allowed in WebGL1 GLSL
uniform sampler2D textures[ 2 ];
...
// note: GLSL ES 1.0 doesn't allow samplers as local variables or non-constant
// array indices, so textures[ int(vTexIndex) ] must be replaced with a branch:
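
A minimal working sketch of that fragment shader, assuming the vUv varying from the question and a textures uniform set from JavaScript as textures: { value: [catTexture, dogTexture] }:

precision highp float;

varying vec2 vUv;        // uv coordinates passed from the vertex shader
varying float vTexIndex; // texture index passed from the vertex shader

uniform sampler2D textures[2]; // set from JS with an array of two textures

void main() {
  // sampler arrays can only be indexed by constant expressions in WebGL1,
  // so branch on the (float) index instead of indexing dynamically
  if (vTexIndex < 0.5) {
    gl_FragColor = texture2D(textures[0], vUv);
  } else {
    gl_FragColor = texture2D(textures[1], vUv);
  }
}

For more than a handful of textures this branching gets unwieldy and you run into the hardware texture-unit limit, so a single atlas texture (which the question already uses for thumbnails) plus per-vertex uv offsets is usually the more scalable choice.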