
So if any THREE.js pros can explain why I can't get a WebGLRenderTarget to be used as a material for a plane in another scene, I'd be pretty happy.

How it works right now: I create a scene with a perspective camera that renders a simple plane. This happens in the Application object.

I also have a WaveMap object that uses another scene and an orthographic camera. Using a fragment shader, it draws the cos(x) * sin(y) function on a quad that takes up the entire screen. I render this to a texture and then create a material that uses this texture.
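For context, a minimal sketch of that kind of render-to-texture setup (not the exact code from the repo; names like waveScene, renderTarget, and renderer are illustrative, and the render call uses the old r72-era signature):

var waveScene = new THREE.Scene();
var waveCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);

// Offscreen buffer the wave pattern is rendered into.
var renderTarget = new THREE.WebGLRenderTarget(512, 512);

var waveMaterial = new THREE.ShaderMaterial({
  vertexShader: [
    'varying vec2 vUv;',
    'void main() {',
    '  vUv = uv;',
    '  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);',
    '}'
  ].join('\n'),
  fragmentShader: [
    'varying vec2 vUv;',
    'void main() {',
    '  float v = cos(vUv.x * 20.0) * sin(vUv.y * 20.0);',
    '  gl_FragColor = vec4(vec3(v * 0.5 + 0.5), 1.0);',
    '}'
  ].join('\n')
});

// Full-screen quad in the orthographic scene.
waveScene.add(new THREE.Mesh(new THREE.PlaneBufferGeometry(2, 2), waveMaterial));

// Old-style call: draws into the render target instead of the canvas.
// (Newer versions use renderer.setRenderTarget(renderTarget) instead.)
renderer.render(waveScene, waveCamera, renderTarget);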

I then pass that material to the Application object to draw the texture on the first plane I mentioned.
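Something along these lines (again a sketch; renderTarget comes from the snippet above, and applicationScene stands in for the Application's scene):

// Material for the plane in the perspective-camera scene.
var planeMaterial = new THREE.MeshBasicMaterial({
  map: renderTarget.texture // on very old (pre-r73) versions: map: renderTarget
});
applicationScene.add(new THREE.Mesh(new THREE.PlaneBufferGeometry(4, 4), planeMaterial));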

The problem is that, for some reason, I can get this to work in the scene with the orthographic camera inside the WaveMap object, but not in the scene with the perspective camera in the Application object after passing the material over. :(

I've tried passing a simple material with a solid color and that works, but when I pass a material that uses a WebGLRenderTarget as a texture, it doesn't show up anymore.

https://github.com/ArminTaheri/rendertotexture-threejs

Comment: I just found this question: stackoverflow.com/questions/32441264/… I did not find it while vigorously looking for a solution. – Armin Taheri

1 Answer


You need to clone any material/texture that you want to render with two different renderers. Each WebGLRenderer owns its own WebGL context, and GPU resources like textures cannot be shared between contexts, so each renderer needs its own copy:

// Make an independent copy for the second renderer's context.
var materialOnOther = originalMaterial.clone();

Prior to r72, you needed to force the texture's image buffer to be updated, like so:

// exampleOfATexture is whatever your texture uniform is called.
materialOnOther.uniforms.exampleOfATexture.value.needsUpdate = true;
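Putting both steps together, a sketch might look like this (otherPlane and exampleOfATexture are placeholder names, not from the question's repo):

// One copy of the material per renderer/context.
var materialOnOther = originalMaterial.clone();

// Use the clone on the mesh drawn by the second renderer.
otherPlane.material = materialOnOther;

// Pre-r72 only: force the texture to be re-uploaded in the new context.
materialOnOther.uniforms.exampleOfATexture.value.needsUpdate = true;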