
Using three.js, the classes OculusRiftEffect, VREffect or VRRenderer allow a programmer to render their scene to an Oculus Rift.

EffectComposer is another three.js class that lets the programmer chain multiple render passes and display the composed result with a single renderer.

My question is, how can I display the composed output of EffectComposer with the Rift?

The problem is as follows:

The OculusRiftEffect, VREffect or VRRenderer class must be initialised with a renderer such as WebGLRenderer. In the render loop, the class must be called as follows, causing the scene to be displayed on the Rift:

this.vrrenderer.render(this.threeScene, this.camera);

The EffectComposer must also be initialised with a renderer such as WebGLRenderer. In the render loop, the EffectComposer must be called as follows, causing the composed scene to be displayed by the renderer:

this.composer.render();
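To make the conflict concrete, here is a minimal sketch of the two setups side by side (assuming the script-include style of the three.js examples of that era; `scene` and `camera` stand in for an existing scene graph and are not defined here):

```javascript
var renderer = new THREE.WebGLRenderer();
document.body.appendChild(renderer.domElement);

// Rift path: wraps the renderer and splits output into two eye views.
var vrEffect = new THREE.VREffect(renderer);

// Post-processing path: also wraps the same renderer.
var composer = new THREE.EffectComposer(renderer);
composer.addPass(new THREE.RenderPass(scene, camera));

function animate() {
  requestAnimationFrame(animate);
  // Each wrapper wants to own the final draw call, so per frame you
  // get stereo output OR the composed output, but not both:
  vrEffect.render(scene, camera); // stereo, no post-processing
  // composer.render();           // post-processing, mono only
}
animate();
```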

However, the EffectComposer can't be initialised with an OculusRiftEffect, VREffect or VRRenderer in place of the WebGLRenderer.

So the question is: how can the EffectComposer be connected to one of the Rift classes for rendering?

Many thanks!

Comment from jimr: reddit thread on the same topic: reddit.com/r/threejs/comments/351pdu/…

1 Answer


I was able to integrate EffectComposer with StereoEffect by altering StereoEffect into a new class called StereoCamera. See this similar question, which has an answer:

Three.js combining StereoEffect with FXAA ShaderPass
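The idea from that answer can be sketched roughly as follows: render each eye's half of the screen through the composer, switching the pass camera per eye. This is a hypothetical illustration, not the answer's exact code; the names `cameraL`, `cameraR`, and `renderPass` are placeholders, and the scissor/viewport API varies between three.js revisions:

```javascript
var composer = new THREE.EffectComposer(renderer);
var renderPass = new THREE.RenderPass(scene, cameraL);
composer.addPass(renderPass);

// Example post-processing step, rendered to screen as the final pass.
var fxaaPass = new THREE.ShaderPass(THREE.FXAAShader);
fxaaPass.renderToScreen = true;
composer.addPass(fxaaPass);

function renderStereo() {
  var w = window.innerWidth / 2;
  var h = window.innerHeight;

  renderer.setScissorTest(true); // older revisions: enableScissorTest(true)

  // Left eye: composed output into the left half of the canvas.
  renderPass.camera = cameraL;
  renderer.setViewport(0, 0, w, h);
  renderer.setScissor(0, 0, w, h);
  composer.render();

  // Right eye: composed output into the right half of the canvas.
  renderPass.camera = cameraR;
  renderer.setViewport(w, 0, w, h);
  renderer.setScissor(w, 0, w, h);
  composer.render();

  renderer.setScissorTest(false);
}
```

The key design point is that the composer, not the stereo effect, issues the final draw, so the stereo logic is reduced to setting the viewport/scissor and eye camera before each `composer.render()` call.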