4
votes

I am using the Three.js library to display a point cloud in a web browser. The point cloud is generated once at start-up, and no further points are added or removed, but it does need to be rotated, panned, and zoomed. I've gone through the tutorial about creating particles in three.js here.

Using the example I can create particles that are squares, or use an image of a sphere as a texture. The image is closer to what I want, but is it possible to generate the point cloud without using an image (with sphere geometry, for example)?

The problem with the image is that when you have thousands of points, they sometimes seem to obscure each other around the edges. From what I can gather, the black region in a point's PNG file blocks the image immediately behind the current point (but is transparent to points further behind).

This obscuring of the images is why I would like to generate the points using shapes. I have tried replacing particles = new THREE.Geometry() with THREE.SphereGeometry(radius, segments, rings) and changing the vertices to spheres.

So my question is: how do I modify the example code so that it renders spheres (or points) instead of squares? Also, is a particle system the most efficient approach for my particular case, or should I just generate the particles and set their individual positions? As I mentioned, I only generate the points once, but then rotate, zoom, and pan them. (I used the Trackball sample code to get the mouse events working.)

Thanks for your help

3
I expect that using an image will work if its background is transparent and not black. Try with this image: github.com/mrdoob/three.js/blob/master/examples/textures/…. Also, TrackballControls rotates the camera, not the points. – WestLangley

3 Answers

7
votes

I don't think rendering a point cloud with spheres is very efficient. You should be able to get away with a particle system, using either a texture or a small canvas program to draw a circle.

One of the first three.js samples uses a canvas program; here are the important bits:

var PI2 = Math.PI * 2;

// Canvas program: draws a filled unit circle; the material scales it per particle.
var program = function ( context ) {
    context.beginPath();
    context.arc( 0, 0, 1, 0, PI2, true );
    context.closePath();
    context.fill();
};

var particle = new THREE.Particle( new THREE.ParticleCanvasMaterial( {
    color: Math.random() * 0x808008 + 0x808080,
    program: program
} ) );

three.js particles

Feel free to adapt the code for the WebGL renderer.

Another clever solution I've seen in the examples is using an encoded WebM video to store the data and passing it to a GLSL shader, which is rendered through a particle system in three.js: three.js webgl kinect

If your point cloud comes from a Kinect, these resources might be useful:

  1. DepthCam
  2. KinectJS

George MacKeron's kinect to js solution

3
votes

When comparing my code to http://threejs.org/examples/#webgl_custom_attributes_particles3 I saw the only difference was:

vec4 outColor = texture2D( texture, gl_PointCoord );
if ( outColor.a < 0.5 ) discard;
gl_FragColor = outColor;

Adding this to the fragment shader fixed the problem for me.
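For context, a complete fragment shader using this discard trick might look like the following (a sketch; the `texture` uniform name matches the snippet above and must match your material's uniforms):

```glsl
uniform sampler2D texture;

void main() {
    // Sample the sprite texture at this fragment's position within the point.
    vec4 outColor = texture2D( texture, gl_PointCoord );

    // Drop mostly-transparent fragments so the square sprite's corners
    // neither write depth nor overdraw the particles behind them.
    if ( outColor.a < 0.5 ) discard;

    gl_FragColor = outColor;
}
```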

It wasn't z-fighting, because the corners overlapped distant particles at random. material.alphaTest = 0.5 didn't work, and turning off depth writes/tests messed up the drawing order.

1
votes

The problem with the image is that when you have thousands of points it seems they sometimes obscure each other around the edges. From what I can gather it seems like the black region in a point's png file blocks the image immediately behind the current point. (But it is transparent to points further behind)

You can get rid of the transparency-overlap problem of the underlying square sprites by turning off depth testing on the material:

depthTest: false

The problem then is that if you add other objects to the scene, depth testing against them fails too, and the point cloud is rendered in front of those objects regardless of the actual order. To get around that, you can additionally disable depth writes:

depthWrite: false