4 votes

When it comes to 3D animation, there are a lot of terms and concepts that I'm not familiar with (maybe a secondary question to append to this one: what are some good books to get familiar with the concepts?). I don't know what a "UV" is (in the context of 3D rendering) and I'm not familiar with what tools exist for mapping pixels on an image to points on a mesh.

I have the following image being produced by a 360-degree camera (it's actually the output of an HTML video element):

360-degree Panorama

I want the center of this image to be the "top" of the sphere, and any radius of the circle in this image to be an arc along the sphere from top to bottom.

Here's my starting point (copying lines of code directly from the Three.JS documentation):

var video = document.getElementById( "texture-video" );

var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera( 75, window.innerWidth / window.innerHeight, 0.1, 1000 );

var renderer = new THREE.WebGLRenderer();
renderer.setSize( window.innerWidth, window.innerHeight );
document.body.appendChild( renderer.domElement );

var texture = new THREE.VideoTexture( video );
texture.minFilter = THREE.LinearFilter;
texture.magFilter = THREE.LinearFilter;
texture.format = THREE.RGBFormat;

var material = new THREE.MeshBasicMaterial( { map: texture } );

var geometry = new THREE.SphereGeometry(0.5, 100, 100);
var mesh = new THREE.Mesh( geometry, material );

scene.add( mesh );

camera.position.z = 1;

function animate()
{
    mesh.rotation.y += 0.01;
    requestAnimationFrame( animate );
    renderer.render( scene, camera );
}
animate();

This produces the following:

Mapped to a sphere

There are a few problems:

  • The texture is rotated 90 degrees
  • The ground is distorted, although this may be fixed if the rotation is fixed?
  • Update: Upon further investigation of the sphere being produced, it's not actually rotated 90 degrees. Instead, the top of the image is the top of the sphere and the bottom of the image is the bottom of the sphere. This causes the left and right edges of the image to become the distorted "sideways ground" I saw
  • This is on the outside of the sphere. I want to project this to the inside of the sphere (and place the camera inside the sphere)

Currently, if I place the camera inside the sphere I get solid black. I don't think it's a lighting issue, because the Three.JS docs say that a MeshBasicMaterial doesn't need lighting. I think the issue may be that the normals of all of the sphere's faces point outward and I need to reverse them. I'm not sure how one would do this, but I'm pretty sure it's possible, since I think this is how skyboxes work.

Doing some research, I'm pretty sure I need to modify the "UV"s to fix this; I just don't know how, or really what that even means...


3 Answers

4 votes

Working Example

I forked @manthrax's CodeSandbox.io solution and updated it with my own:

https://codesandbox.io/s/4w1njkrv9

The Solution

So after spending a day researching UV mapping to understand what it meant and how it worked, I was able to sit down and scratch out some trig to map points on a sphere to points on my stereographic image. It basically came down to the following:

  1. Use the arccosine of the Y coordinate to determine the magnitude of a polar coordinate on the stereographic image
  2. Use the arctangent of the X and Z coordinates to determine the angle of the polar coordinate on the stereographic image
  3. Use x = Rcos(theta), y = Rsin(theta) to compute the rectangular coordinates on the stereographic image

If time permits I may draw a quick image in Illustrator or something to explain the math, but it's standard trigonometry.
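In code form, the mapping works out to something like this (a rough sketch of steps 1-3 for the ideal 360-degree case, before the field-of-view scaling and the upside-down workaround described below; uvForPoint is just an illustrative name, not part of the final code):

function uvForPoint( x, y, z )
{
    // Step 1: the arccosine of Y gives the distance from the centre of the image.
    // acos( y ) runs from 0 at the top of the sphere to PI at the bottom, so dividing
    // by 2 * PI keeps the radius within 0.5, the circle inscribed in the [0, 1] UV square.
    var radius = Math.acos( y ) / ( 2 * Math.PI );

    // Step 2: the arctangent of X and Z gives the angle of the polar coordinate.
    var angle = Math.atan2( x, z );

    // Step 3: back to rectangular coordinates, shifted so the image centre is at (0.5, 0.5).
    return {
        u: ( radius * Math.cos( angle ) ) + 0.5,
        v: ( radius * Math.sin( angle ) ) + 0.5
    };
}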

I went a step further after this, because the camera I was using only has a 240 degree vertical viewing angle - which caused the image to get slightly distorted (especially near the ground). By subtracting the vertical viewing angle from 360 and dividing by two, you get an angle from the vertical within which no mapping should occur. Because the sphere is oriented along the Y axis, this angle maps to a particular Y coordinate - above which there's data, and below which there isn't.

  1. Calculate this "minimum Y value"
  2. For all points on the sphere:
    • If the point is above the minimum Y value, scale it linearly so that the first such value is counted as "0" and the top of the sphere is still counted as "1" for mapping purposes
    • If the point is below the minimum Y value, return nothing

Weird Caveats

For some reason the code I wrote flipped the image upside down. I don't know if I messed up on my trigonometry or if I messed up on my understanding of UV maps. Whatever the case, this was trivially fixed by flipping the sphere 180 degrees after mapping.

As well, I don't know how to "return nothing" in the UV map, so instead I mapped all points below the minimum Y value to the corner of the image (which was black).

With a 240-degree viewing angle, the space at the bottom of the sphere with no image data was sufficiently large (on my monitor) that I could see the black circle when looking directly ahead. I didn't like the visual appearance of this, so I plugged in 270 for the vertical FOV. This leads to minor distortion around the ground, but not as bad as when using 360.

The Code

Here's the code I wrote for updating the UV maps:

// Enter the vertical FOV for the camera here
var vFov = 270; // = 240;

var material = new THREE.MeshBasicMaterial( { map: texture, side: THREE.BackSide } );
var geometry = new THREE.SphereGeometry(0.5, 200, 200);

function updateUVs()
{
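    // (360 - vFov) / 2 is the angle from the vertical (in degrees) with no image data;
    // converting it to radians and taking its cosine gives the Y cutoff on the unit sphere.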
    var maxY = Math.cos(Math.PI * (360 - vFov) / 180 / 2);
    var faceVertexUvs = geometry.faceVertexUvs[0];
    // The sphere consists of many FACES
    for ( var i = 0; i < faceVertexUvs.length; i++ )
    {
        // For each face...
        var uvs = faceVertexUvs[i];
        var face = geometry.faces[i];
        // A face is a triangle (three vertices)
        for ( var j = 0; j < 3; j ++ )
        {
            // For each vertex...
            // x, y, and z refer to the point on the sphere in 3d space where this vertex resides
            var x = face.vertexNormals[j].x;
            var y = face.vertexNormals[j].y;
            var z = face.vertexNormals[j].z;

            // Because our stereograph goes from 0 to 1 but our vertical field of view cuts off our Y early, rescale Y accordingly
            var scaledY = (((y + 1) / (maxY + 1)) * 2) - 1;

            // uvs[j].x, uvs[j].y refer to a point on the 2d texture
            if (y < maxY)
            {
                var radius = Math.acos(1 - ((scaledY / 2) + 0.5)) / Math.PI;
                var angle = Math.atan2(x, z);

                uvs[j].x = (radius * Math.cos(angle)) + 0.5;
                uvs[j].y = (radius * Math.sin(angle)) + 0.5;
            } else {
                uvs[j].x = 0;
                uvs[j].y = 0;
            }
        }
    }
    // For whatever reason my UV mapping turned everything upside down
    // Rather than fix my math, I just replaced "minY" with "maxY" and
    // rotated the sphere 180 degrees
    geometry.rotateZ(Math.PI);
    geometry.uvsNeedUpdate = true;
}
updateUVs();

var mesh = new THREE.Mesh( geometry, material );

The Results

Now if you add this mesh to a scene everything looks perfect:

Screenshots of the final result

One Thing I Still Don't Understand

Right around the "hole" at the bottom of the sphere there's a multi-colored ring. It almost looks like a mirror of the sky. I don't know why this exists or how it got there. Could anyone shed light on this in the comments?

1 vote

Here is as close as I could get it in about 10 minutes of fiddling with a polar unwrapping of the UVs.

You can modify the polarUnwrap function to try to get a better mapping...

https://codesandbox.io/s/8nx75lkn28

Screenshot of the polar-unwrapped result
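For reference, a polar unwrap along these lines looks roughly like the following. This is only an illustrative sketch, not the exact polarUnwrap from the sandbox: it rewrites the sphere's default equirectangular UVs so that latitude becomes a radius from the image centre and longitude becomes the angle around it (assuming the default UVs put v = 1 at the top of the sphere).

function polarUnwrap( geometry )
{
    var faceVertexUvs = geometry.faceVertexUvs[ 0 ];
    for ( var i = 0; i < faceVertexUvs.length; i++ )
    {
        for ( var j = 0; j < 3; j++ )
        {
            var uv = faceVertexUvs[ i ][ j ];
            var radius = ( 1 - uv.y ) * 0.5;   // top of the sphere maps to the centre of the image
            var angle = uv.x * 2 * Math.PI;    // longitude maps to the angle around the centre
            uv.x = ( radius * Math.cos( angle ) ) + 0.5;
            uv.y = ( radius * Math.sin( angle ) ) + 0.5;
        }
    }
    geometry.uvsNeedUpdate = true;
}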

You can replace the TextureLoader().loadTexture() with

// assuming you have created an HTML video element with id="video"
var video = document.getElementById( 'video' );

var texture = new THREE.VideoTexture( video );
texture.minFilter = THREE.LinearFilter;
texture.magFilter = THREE.LinearFilter;
texture.format = THREE.RGBFormat;

to get your video fed in there...

More info here:

https://threejs.org/docs/#api/textures/VideoTexture

Also this may be useful to you:

https://community.theta360.guide/t/displaying-thetas-dual-fisheye-video-with-three-js/1160

-1 votes

I think it would be quite difficult to modify the UVs so that the stereographically projected image fits. The UVs of a sphere are set up for textures with an equirectangular projection.

To transform the image from stereographic to equirectangular, you might want to use panorama tools like PTGui or Hugin. Or you can use Photoshop (apply Filter > Distort > Polar Coordinates > Polar to Rectangular).

Equirectangular projection of the image (with Photoshop), resized to a 2:1 aspect ratio (not necessary for the texture)

If you want the texture on the inside of the sphere (i.e. the normals flipped), you can set the material's side to THREE.BackSide.

var material = new THREE.MeshBasicMaterial( { map: texture, side: THREE.BackSide } );

You may then have to flip the texture horizontally: How to flip a Three.js texture horizontally
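If it helps, one common way to mirror a texture horizontally (a sketch, not taken from the linked answer) is to enable wrapping and use a negative repeat:

texture.wrapS = THREE.RepeatWrapping;
texture.repeat.x = -1; // mirrors the texture along the horizontal axis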