3 votes

I'm trying to implement a simple GPU picker in three.js using MeshDepthMaterial. I've managed to extract the color value by following this example: https://github.com/mrdoob/three.js/blob/master/examples/webgl_interactive_cubes_gpu.html

And by porting the unpackRGBAToDepth function from the GLSL shader chunk below to JavaScript to reconstruct the depth:

https://github.com/mrdoob/three.js/blob/acdda10d5896aa10abdf33e971951dbf7bd8f074/src/renderers/shaders/ShaderChunk/packing.glsl

But the value comes out as a float between 0.0 and 255.0 (I was expecting it to be between 0.0 and 1.0, or the actual depth). My question: how can I use this value? Can it be translated to the actual depth, and if so, how?
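For reference, a direct JavaScript port of that function might look like the sketch below. The constants mirror UnpackFactors in three.js's packing.glsl; note the division by 255, which accounts for reading Uint8 pixel data rather than the 0..1 floats a GLSL texture lookup would return:

```javascript
// Sketch of unpackRGBAToDepth from three.js's packing.glsl, ported to plain
// JavaScript. UnpackDownscale undoes the PackUpscale factor applied when the
// depth was packed into the RGBA channels.
var UnpackDownscale = 255 / 256;
var UnpackFactors = [
  UnpackDownscale / (256 * 256 * 256), // r: least significant byte
  UnpackDownscale / (256 * 256),       // g
  UnpackDownscale / 256,               // b
  UnpackDownscale                      // a: most significant byte
];

// rgba: four Uint8 values (0..255), e.g. as read via readRenderTargetPixels
function unpackRGBAToDepth(rgba) {
  var depth = 0;
  for (var i = 0; i < 4; i++) {
    // divide by 255 first: GLSL texture lookups return floats in 0..1,
    // but readRenderTargetPixels yields bytes in 0..255
    depth += (rgba[i] / 255) * UnpackFactors[i];
  }
  return depth; // depth-buffer value in [0, 1]
}
```

With this, an all-zero pixel unpacks to 0 and a fully saturated pixel unpacks to a value just under 1.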

My implementation is quite close to the Stack Overflow answer, though I'm using a logarithmic depth buffer, so it differs slightly. I created an example (jsfiddle.net/1x4Lk8w6/1) where I'm trying to get the actual depth, but it always ends up as 1 if there is something under the mouse. – Avocher

Check out the effects as you change the far plane of your camera. 10000 is pretty big. – gaitat

The value remains 1 with the far plane at 100, so I'm quite certain the issue is not the distance between the planes. – Avocher
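Regarding the logarithmic depth buffer mentioned in the comments: with logarithmicDepthBuffer enabled, three.js writes the fragment depth as log2(1 + w) / log2(far + 1), where w is the camera-space distance. A sketch of inverting that encoding (based on three.js's logdepthbuf shader chunks; verify against your renderer setup):

```javascript
// Sketch: inverting three.js's logarithmic depth encoding. The fragment
// depth is written as
//   depth = log2(1 + w) / log2(far + 1)
// (see logdepthbuf_fragment.glsl), so the camera-space distance w is:
function logDepthToDistance(depth, far) {
  return Math.pow(2, depth * Math.log2(far + 1)) - 1;
}
```

A depth of 0 maps back to the camera and a depth of 1 to the far plane, so an unpacked value stuck at exactly 1 points at the unpacking step rather than the camera planes.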

1 Answer

0
votes

After running into this problem myself (probably by following the same path of copying code from all over three.js), I think I have found the cause.

When reading from the RGBA texture in JavaScript, you get the color values in Uint8 format (0 to 255), whereas the texture lookups in GLSL (from which I suspect the calculations were copied) return float values in the range 0 to 1.

So you need to multiply the color vector by 1/255 before unpacking:

// ...
// pixelBuffer is a Uint8Array(4), vec4 is a THREE.Vector4, and unPackFactors
// holds the UnpackFactors constants from packing.glsl

renderer.readRenderTargetPixels(
    depthRenderTarget, x, y, 1, 1, pixelBuffer);

// scale the Uint8 values (0..255) down to the 0..1 range GLSL would see
vec4
  .fromArray(pixelBuffer)
  .multiplyScalar(1 / 255);

// dot with the unpack factors to reconstruct the depth value
var logz = vec4.dot(unPackFactors);

// ...
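If you are on the default (non-logarithmic) depth buffer, the unpacked value in [0, 1] can then be converted to view-space Z using the perspectiveDepthToViewZ formula from the same packing.glsl, ported here as a sketch:

```javascript
// Port of perspectiveDepthToViewZ from three.js's packing.glsl: maps a
// perspective depth-buffer value in [0, 1] to view-space Z (negative,
// since the camera looks down -Z).
function perspectiveDepthToViewZ(depth, near, far) {
  return (near * far) / ((far - near) * depth - far);
}
```

A depth of 0 maps to -near and a depth of 1 to -far, so the distance in front of the camera is simply -viewZ.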