I have a scene in Three.js (using A-Frame) that requires environment mapping for lighting. Using a standard HDR cubemap, I get the following results:
This is correct as far as roughness-based blurring goes, since mipmaps are being generated and minFilter is set to LinearMipmapLinearFilter. The issue with this approach is that no ambient lighting is applied: the directional light in the scene is the only thing providing any lighting information, which results in entirely black shadows no matter how bright the HDRI is.
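For context, the filtering setup behind that first result is roughly the following (a minimal sketch rather than my full component; cubeTexture stands in for the cubemap returned by HDRCubeTextureLoader in the code further down):

// Minimal sketch of the roughness-blur setup described above; `cubeTexture`
// stands in for the loaded HDR cube texture.
cubeTexture.generateMipmaps = true;                     // build a mip chain for the cubemap
cubeTexture.minFilter = THREE.LinearMipmapLinearFilter; // trilinear filtering across those mips
cubeTexture.magFilter = THREE.LinearFilter;
cubeTexture.needsUpdate = true;                         // re-upload with the new sampler settings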
However, if I run the same cubemap through Three's PMREMGenerator in addition to the above, the ambient lighting issue is solved. Unfortunately, this is what happens as a result:
As shown here, the texture filtering is now out of whack. According to the comments left in the PMREMGenerator script itself:
This class generates a Prefiltered, Mipmapped Radiance Environment Map (PMREM) from a cubeMap environment texture. This allows different levels of blur to be quickly accessed based on material roughness. It is packed into a special CubeUV format that allows us to perform custom interpolation so that we can support nonlinear formats such as RGBE. Unlike a traditional mipmap chain, it only goes down to the LOD_MIN level, and then creates extra even more filtered 'mips' at the same LOD_MIN resolution, associated with higher roughness levels. In this way we maintain resolution to smoothly interpolate diffuse lighting while limiting sampling computation.
...which leads me to believe the output should be smoothed, like in my first example.
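As far as I can tell from the r122 source and the official example linked further down, the generated texture is meant to be consumed roughly like this (a hedged sketch; pmremGenerator, cubeTexture and material are placeholder names), with the roughness blur coming from the shader's CubeUV lookup rather than from hardware mipmaps:

// Hedged sketch, based on my reading of the r122 PMREMGenerator source:
// fromCubemap() returns a WebGLRenderTarget whose texture is flagged as CubeUV,
// and the standard material picks the blur level from roughness in the shader.
const renderTarget = pmremGenerator.fromCubemap(cubeTexture);
console.log(renderTarget.texture.mapping === THREE.CubeUVReflectionMapping); // expected: true
material.envMap = renderTarget.texture; // no minFilter/mipmap settings needed on this texture
material.needsUpdate = true;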
Here's my code for the first example:
// HDRCubeTextureLoader lives in three/examples/jsm/loaders/HDRCubeTextureLoader.js
const ctl = new HDRCubeTextureLoader();
ctl.setPath(hdrPath);
ctl.setDataType(THREE.UnsignedByteType); // keep the RGBE data in an 8-bit texture
const hdrUrl = [
    `${src}/px.hdr`,
    `${src}/nx.hdr`,
    `${src}/py.hdr`,
    `${src}/ny.hdr`,
    `${src}/pz.hdr`,
    `${src}/nz.hdr`
];
// load() returns the cube texture immediately and fills it in once all six faces arrive
const hdrSky = ctl.load(hdrUrl, tex => this.skies[src] = hdrSky);
// Then later...
obj.material.envMap = this.skies[this.skySources[index]];
obj.material.needsUpdate = true;
Here's my code for the second example:
const ctl = new HDRCubeTextureLoader();
ctl.setPath(hdrPath);
ctl.setDataType(THREE.UnsignedByteType); // keep the RGBE data in an 8-bit texture
const hdrUrl = [
    `${src}/px.hdr`,
    `${src}/nx.hdr`,
    `${src}/py.hdr`,
    `${src}/ny.hdr`,
    `${src}/pz.hdr`,
    `${src}/nz.hdr`
];
// PMREMGenerator here is the r122 version (see below), driven by A-Frame's renderer
const pmremGen = new PMREMGenerator(this.el.sceneEl.renderer);
pmremGen.compileCubemapShader(); // pre-compile so the first fromCubemap() call doesn't stall
const hdrSky = ctl.load(hdrUrl, tex => {
    // Pack the loaded cubemap into a CubeUV render target and keep its texture
    const hdrRenderTarget = pmremGen.fromCubemap(hdrSky);
    this.skies[src] = hdrRenderTarget.texture;
});
// Then later...
obj.material.envMap = this.skies[this.skySources[index]];
obj.material.needsUpdate = true;
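For what it's worth, here's the kind of sanity check I've been running once the load callback has fired (a sketch; dispose() does exist on the r122 generator and only frees its internal working targets, not the output):

// Hedged debugging sketch, run after fromCubemap() has produced the target:
console.log(this.skies[src].mapping === THREE.CubeUVReflectionMapping); // CubeUV packing flag
pmremGen.dispose(); // release the generator's internal render targets once we're done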
I seem to have hit a wall in regard to the filtering. Even when I explicitly change the filtering type and turn on mipmap generation inside of PMREMGenerator.js, the results appear to be the same. Here's an official example that uses the PMREMGenerator without any issue: https://threejs.org/examples/webgl_materials_envmaps_hdr.html
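For reference, the change I tried inside PMREMGenerator.js was along these lines (paraphrased from memory; the exact shape of that internal params object may differ between revisions, so treat it as a sketch):

// Inside _allocateTargets() in my copy of PMREMGenerator.js (r122) - roughly:
const params = {
    magFilter: THREE.LinearFilter,
    minFilter: THREE.LinearMipmapLinearFilter, // was a non-mipmapped filter
    generateMipmaps: true,                     // was false
    // ...the remaining properties left exactly as shipped
};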
As a closing remark, I'll note that we're using Three.js r111 (and there are reasons we can't fully switch it out), so I brought in the PMREMGenerator from the latest version of Three as of this writing (r122; a later version is needed because the r111 one is written completely differently). So I wouldn't be surprised if all of this is caused by some conflict between versions.
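If it helps anyone reproduce, a quick runtime check of the pieces being mixed (nothing conclusive, just confirming what's in play):

// Quick sanity check of the version mix described above:
console.log(THREE.REVISION);                       // "111" for the core renderer here
console.log(typeof THREE.CubeUVReflectionMapping); // the mapping constant itself exists in r111
// ...the PMREMGenerator, however, is the one copied in from r122.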
EDIT: I just put the resulting envmap as a standard map on some planes, and much to my surprise none of the blurred LODs even show up. Here's what mine look like:
And here's what it should resemble (don't mind the torus knot):
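For anyone wanting to reproduce that check, this is roughly how I put the generated texture on a plane for inspection (a sketch, assuming it runs after the cubemap has finished loading):

// Hedged sketch of the plane check above: show the packed CubeUV atlas on an
// unlit plane so the extra blurred mips (or their absence) are visible.
const debugPlane = new THREE.Mesh(
    new THREE.PlaneBufferGeometry(4, 4),
    new THREE.MeshBasicMaterial({ map: this.skies[src] })
);
this.el.sceneEl.object3D.add(debugPlane);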
EDIT: I've found a workaround for now (essentially, not using PMREMGenerator), but will leave this up in case a solution is discovered.