3 votes

I want to have a DOM node track a particle in my THREE.js simulation. The simulation is built with a Points object backed by a BufferGeometry, and I set the position of each vertex in the render loop. Over the course of the simulation I move / rotate both the camera and the Points object (through its parent Object3D).
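Roughly, the setup looks like this (a simplified sketch, not my actual code — updateParticlePositions stands in for my simulation step, and scene / renderer / camera are the usual three.js objects):

const particleCount = 1000
const positions = new Float32Array(particleCount * 3)
const geometry = new THREE.BufferGeometry()
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3))

const points = new THREE.Points(geometry, new THREE.PointsMaterial({ size: 2 }))
const parent = new THREE.Object3D()
parent.add(points)
scene.add(parent)

function animate () {
  requestAnimationFrame(animate)
  // vertex positions are rewritten every frame...
  updateParticlePositions(positions)
  geometry.attributes.position.needsUpdate = true
  // ...while both the camera and the parent Object3D move / rotate
  parent.rotation.y += 0.002
  renderer.render(scene, camera)
}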

I can't figure out how to get reliable screen coordinates for any of my particles. I've followed the instructions in other questions, like Three.JS: Get position of rotated object and Converting World coordinates to Screen coordinates in Three.js using Projection, but none of them work for me. At this point I can see that the calculated projections of the vertices change with my camera movements and object rotations, but not in a way I can actually map to the screen. Also, two particles that neighbor each other on screen will sometimes yield wildly different projected positions.

Here's my latest attempt:

const { x, y, z } = layout.getNodePosition(nodes[nodeHoverTarget].id)

// rebuild the model-view matrix by hand: view * model
var m = camera.matrixWorldInverse.clone()
var mw = points.matrixWorld.clone()
var p = camera.projectionMatrix.clone()
var modelViewMatrix = m.multiply(mw)

// apply projection * modelView to the vertex, as the shader would
var position = new THREE.Vector3(x, y, z)
var projectedPosition = position.applyMatrix4(p.multiply(modelViewMatrix))
console.log(projectedPosition)

Essentially I've replicated the operations in my shader to derive gl_Position (projectionMatrix * modelViewMatrix * vec4(position, 1.0)).

projectedPosition is where I'd like to store the screen coordinates.

I'm sorry if I've missed something obvious... I've tried a lot of things but so far nothing has worked :/

Thanks in advance for any help.

Any chance you can create a fiddle reproducing the problem? – noveyak

1 Answer

7 votes

I figured it out...

var position = new THREE.Vector3(x, y, z)

// transform the vertex from the Points object's local space into world
// space, then project it through the camera into normalized device coordinates
var projectedPosition = position.applyMatrix4(points.matrixWorld).project(camera)
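Note that .project(camera) leaves the point in normalized device coordinates (x and y in [-1, 1]), so to actually pin a DOM node to the particle you still need to map that into CSS pixels. Roughly like this (a sketch — label stands in for the DOM node being moved, and the updateMatrixWorld calls just make sure the matrices are current before projecting):

points.updateMatrixWorld()
camera.updateMatrixWorld()

var ndc = new THREE.Vector3(x, y, z)
  .applyMatrix4(points.matrixWorld) // local -> world
  .project(camera)                  // world -> NDC

var canvas = renderer.domElement
var screenX = (ndc.x + 1) / 2 * canvas.clientWidth
var screenY = (1 - ndc.y) / 2 * canvas.clientHeight

label.style.transform = 'translate(' + screenX + 'px, ' + screenY + 'px)'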