I'm using an image drawn into a canvas element as a texture in Three.js: I perform image manipulations on the canvas in JavaScript, then set needsUpdate on the texture (it's a property, not a method) so the canvas gets re-uploaded to the GPU. This works, but it's quite slow.
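For reference, my current approach looks roughly like this (a sketch; names are illustrative):

```javascript
// CPU-side image manipulation on a canvas, re-uploaded to the GPU after every change.
const canvas = document.createElement('canvas');
canvas.width = image.width;
canvas.height = image.height;
const ctx = canvas.getContext('2d');

const texture = new THREE.Texture(canvas);
const material = new THREE.MeshBasicMaterial({ map: texture });

function update() {
  ctx.drawImage(image, 0, 0);
  const pixels = ctx.getImageData(0, 0, canvas.width, canvas.height);
  // ...loop over pixels.data here (the slow, per-pixel CPU part)...
  ctx.putImageData(pixels, 0, 0);
  texture.needsUpdate = true; // triggers a full re-upload of the canvas
}
```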
I'd like to perform the image calculations in a fragment shader instead. I've found many examples which almost do this:
- Shader materials: http://mrdoob.github.io/three.js/examples/webgl_shader2.html This example performs image manipulations in a fragment shader, but that shader serves as the fragment shader of an entire material. I only want to apply the shader to a texture, and then use that texture as a component of a second material.
- Render to texture: https://threejsdoc.appspot.com/doc/three.js/examples/webgl_rtt.html This renders the entire scene to a WebGLRenderTarget and uses that as the texture in a material. I only want to pre-process an image, not render an entire scene.
- Effects composer: http://www.airtightinteractive.com/demos/js/shaders/preview/ This applies shaders as a post-process to the entire scene.
Edit: Here's another one:
- Render to another scene: http://relicweb.com/webgl/rt.html This example, referenced in "Three.js Retrieve data from WebGLRenderTarget (water sim)", uses a second scene with its own orthographic camera to render a dynamic texture to a WebGLRenderTarget, which is then used as a texture in the primary scene. I guess this is a special case of the "render to texture" example listed above; it would probably work for me, but it seems over-complicated.
As I understand it, ideally I'd be able to make a new framebuffer object with its own fragment shader, render to it on its own, and use its output as a texture uniform for another material's fragment shader. Is this possible?
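Concretely, here is a sketch of what I imagine in three.js terms, assuming a full-screen quad and the current setRenderTarget API (older three.js versions passed the render target to renderer.render); sourceTexture, imageWidth, and imageHeight are placeholders:

```javascript
// Off-screen pass: a full-screen quad whose ShaderMaterial does the image work.
const rtScene = new THREE.Scene();
const rtCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
const rt = new THREE.WebGLRenderTarget(imageWidth, imageHeight);

const processMaterial = new THREE.ShaderMaterial({
  uniforms: { tSource: { value: sourceTexture } }, // the original image texture
  vertexShader: `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = vec4(position.xy, 0.0, 1.0); // full-screen quad, camera-independent
    }
  `,
  fragmentShader: `
    uniform sampler2D tSource;
    varying vec2 vUv;
    void main() {
      vec4 c = texture2D(tSource, vUv);
      gl_FragColor = vec4(1.0 - c.rgb, c.a); // placeholder manipulation: invert
    }
  `
});
rtScene.add(new THREE.Mesh(new THREE.PlaneGeometry(2, 2), processMaterial));

// Render the quad into the render target instead of to the screen.
renderer.setRenderTarget(rt);
renderer.render(rtScene, rtCamera);
renderer.setRenderTarget(null);
```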
Edit 2: It looks like I might be asking something similar to this: Shader Materials and GL Framebuffers in THREE.js, though that question doesn't appear to have been resolved.
Render to a texture using a ShaderMaterial, and then use that texture as a uniform for a second ShaderMaterial. – WestLangley
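A minimal sketch of that suggestion, continuing the render-target sketch above (rt.texture holds the first pass's output; older three.js versions used the render target object itself as the texture):

```javascript
// Second material: consume the processed texture like any other texture.
const finalMaterial = new THREE.ShaderMaterial({
  uniforms: { tProcessed: { value: rt.texture } },
  vertexShader: `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: `
    uniform sampler2D tProcessed;
    varying vec2 vUv;
    void main() {
      gl_FragColor = texture2D(tProcessed, vUv); // plus whatever this material adds
    }
  `
});

// Or, if the second pass needs no custom shading:
const simpleMaterial = new THREE.MeshBasicMaterial({ map: rt.texture });
```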