
I'm trying to implement a path tracer using THREE.js. I'm basically rendering a fullscreen quad and the path tracing happens in the pixel shader.

I want a higher sampling rate, and one way to achieve this is to sample one path per pixel on each pass and accumulate the resulting images (i.e. average the images obtained across shader passes).

As it is I am able to generate the images I need but I have no idea how to accumulate them. My guess would be that I have to use two render targets; one would contain the "latest" sampled image and one would contain the average of all the images displayed so far.

I just don't know how to get the data from a WebGLRenderTarget and use it to manipulate the data contained in another render target. Is this even possible with Three.js? I've been looking into framebuffer objects, and MrDoob's FBO example (http://www.mrdoob.com/lab/javascript/webgl/particles/particles_zz85.html) looks promising, but I'm not sure I'm headed down the right path.
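Here is a minimal sketch of the kind of wiring I am asking about (accumTarget, averagingMaterial, width and height are placeholder names; I am not sure this is the right approach):

// Placeholder sketch: a float render target whose contents I want to feed
// into the material used by the next pass.
var accumTarget = new THREE.WebGLRenderTarget(width, height, {
    minFilter: THREE.NearestFilter,
    magFilter: THREE.NearestFilter,
    format: THREE.RGBAFormat,
    type: THREE.FloatType
});

// Recent three.js exposes the target's contents as accumTarget.texture,
// which can be assigned to a sampler uniform like any other texture.
averagingMaterial.uniforms.tAccum.value = accumTarget.texture;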

See threejs.org/examples/webgl_gpgpu_birds.html. It shows an example of ping-ponging between two render targets. – WestLangley

1 Answer


I think the issue is that you can't read from and write to the same buffer. Say you render something one frame: you need a pass that outputs to the accumulation buffer. The next frame you need to do your calculations and save the result into that same buffer, but as far as I know this is not currently possible with WebGL.

What you can do instead is use two buffers. In the shader where you do your calculations and write your output, add another texture sampler, read the buffer written on the previous frame, write to the other one, and then alternate. That way you always have the accumulated values available; you can use whatever math you want for the accumulation, but you need to make sure you read the right buffer on the right frame.
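As a rough sketch, the accumulation fragment shader could look something like this (GLSL in a JS string; the uniform names are just examples, _frameCount is assumed to be incremented from JavaScript every frame, and the vertex shader is assumed to pass vUv):

var accumulationFragmentShader = [
    'uniform sampler2D _accumulationBuffer;  // running average from the previous frame',
    'uniform sampler2D _newSample;           // freshly path-traced image for this frame',
    'uniform float _frameCount;              // number of samples accumulated so far',
    'varying vec2 vUv;',
    'void main() {',
    '    vec3 prev = texture2D(_accumulationBuffer, vUv).rgb;',
    '    vec3 curr = texture2D(_newSample, vUv).rgb;',
    '    // incremental average: new = prev + (curr - prev) / N',
    '    gl_FragColor = vec4(mix(prev, curr, 1.0 / _frameCount), 1.0);',
    '}'
].join('\n');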

three.js also has post-processing utilities (e.g. EffectComposer) that are handy for setting up multi-pass rendering like this.

var flipFlop = true;

// Two float render targets that are ping-ponged between frames.
// BUFFERSIZE is whatever resolution you are accumulating at.
var buffer1 = new THREE.WebGLRenderTarget(BUFFERSIZE, BUFFERSIZE, {minFilter: THREE.LinearFilter, magFilter: THREE.LinearFilter, format: THREE.RGBAFormat, type: THREE.FloatType});
var buffer2 = new THREE.WebGLRenderTarget(BUFFERSIZE, BUFFERSIZE, {minFilter: THREE.LinearFilter, magFilter: THREE.LinearFilter, format: THREE.RGBAFormat, type: THREE.FloatType});


function render() {
    // Read the buffer written on the previous frame and feed it into this frame's computation
    yourComputeShaderMaterial.uniforms._accumulationBuffer.value = flipFlop ? buffer2 : buffer1;

    if (flipFlop) // Frame 0
        renderer.render(scene, camera, buffer1);
    else // Frame 1
        renderer.render(scene, camera, buffer2); 

    // Get whatever was just rendered this frame and use it in the final pass
    yourEndShader.uniforms._accumulationBuffer.value = !flipFlop ? buffer2 : buffer1;

    // Switch frame
    flipFlop = !flipFlop;
}
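
One caveat: in newer three.js releases the render-target argument of renderer.render() was removed, so the same ping-pong would be written with renderer.setRenderTarget() instead, roughly like this (endScene and endCamera stand for whatever fullscreen quad and camera your final pass uses):

function render() {
    // In recent versions you assign the target's .texture, not the target itself.
    yourComputeShaderMaterial.uniforms._accumulationBuffer.value =
        (flipFlop ? buffer2 : buffer1).texture;

    // Render the compute pass into the "write" buffer.
    renderer.setRenderTarget(flipFlop ? buffer1 : buffer2);
    renderer.render(scene, camera);

    // Feed the buffer we just wrote into the final pass and draw it to the screen.
    yourEndShader.uniforms._accumulationBuffer.value =
        (flipFlop ? buffer1 : buffer2).texture;
    renderer.setRenderTarget(null);
    renderer.render(endScene, endCamera);

    // Switch frame
    flipFlop = !flipFlop;
}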