
I'm writing a very basic raycaster for a 3D scene with triangulated objects and everything worked fine until I decided to try casting rays from points other than the origin of the scene (0/0/0).

However, when I changed the origin of the ray to (0/1/0), the intersection test suddenly returned a wrong intersection point for one of the triangles.

I'm deliberately "shooting" the rays toward the center of the triangle, so obviously that should be the intersection point. I simply don't know what exactly is leading to the wrong results in my code.

(I'm not using Möller-Trumbore at the moment because I'd like to start out with a simpler, more basic approach, but I will switch to Möller-Trumbore when optimizing the code.)

These are the coordinates of the three vertices of the above-mentioned triangle:

-2.0/2.0/0.0 | 0.0/3.0/2.0 | 2.0/2.0/0.0

This is the center of the triangle:

0.0/2.3333333333333335/0.6666666666666666

This is my ray (origin + t * Direction):

Origin: 0.0/1.0/0.0

Direction (normalized): 0.0/0.894427190999916/0.4472135954999579

This is the obviously wrong intersection point my program calculated (before checking and finding out that the point is not even on the triangle):

0.0/5.0/1.9999999999999996

So yeah, it's not hard to see (even without a calculator) that the ray should hit the triangle at its center at roughly t = 1.5. My code, however, returns the value 4.472135954999579 for t.
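As a quick sanity check of the expected value (a standalone sketch with the vectors inlined as `double[]` rather than the asker's `Vector` class): since the ray points straight at the centroid, the expected t is just the distance from the ray origin to the centroid.

```java
public class RaySanityCheck {
    public static void main(String[] args) {
        double[] v1 = {-2.0, 2.0, 0.0};
        double[] v2 = { 0.0, 3.0, 2.0};
        double[] v3 = { 2.0, 2.0, 0.0};

        // Centroid = average of the three vertices
        double cx = (v1[0] + v2[0] + v3[0]) / 3.0; // 0.0
        double cy = (v1[1] + v2[1] + v3[1]) / 3.0; // 2.333...
        double cz = (v1[2] + v2[2] + v3[2]) / 3.0; // 0.666...

        // Ray origin is (0, 1, 0); the direction is normalized,
        // so the expected t is the distance origin -> centroid
        double dx = cx - 0.0, dy = cy - 1.0, dz = cz - 0.0;
        double expectedT = Math.sqrt(dx * dx + dy * dy + dz * dz);
        System.out.println(expectedT); // ~1.4907
    }
}
```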

Here's my code for the intersection check:

    public Vector intersectsWithTriangle(Ray ray, Triangle triangle) {
        Vector triangleNormal = triangle.getNormalVector();
        double normalDotRayDirection = triangleNormal.dotProduct(ray.getDirection());

        if (Math.abs(normalDotRayDirection) == 0) {
            // Ray is parallel to the triangle plane
            return null;
        }

        double d = triangleNormal.dotProduct(triangle.getV1AsVector());
        double t = (triangleNormal.dotProduct(ray.getOrigin()) + d) / normalDotRayDirection;

        // Check if triangle is behind ray
        if (t < 0) return null;

        // Get point of intersection between ray and triangle
        Vector intersectionPoint = ray.getPosAt(t);

        // Check if point is inside the triangle
        if (isPointInTriangle(intersectionPoint, triangle, triangleNormal)) {
            return intersectionPoint;
        }

        return null;
    }

Any ideas what's wrong with the line that calculates t?

hmm, I think you should combine both steps into one method - see the Möller-Trumbore algorithm: en.m.wikipedia.org/wiki/… (meowgoesthedog)

If your code works correctly for (0, 0, 0), maybe you can just translate the triangle vertices via v - p, where p is the point, and then apply the code for (0, 0, 0). (Robert Dodier)

1 Answer


If the ray is given by o + t*v and the triangle plane is defined by normal vector n and point p, then we are looking for t such that n*(o + t*v) = n*p, which gives t = (n*p - n*o)/(n*v). So you seem to have a sign error, and the correct computation for t should be:

double t = (d - triangleNormal.dotProduct(ray.getOrigin())) / normalDotRayDirection;

As long as the ray origin was (0,0,0) the wrong sign did not matter.
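To see the two formulas side by side, here is a standalone sketch (vectors inlined as `double[]` rather than the asker's `Vector` class) that plugs the question's numbers into both versions of the computation. For the given vertices, the normal n = (v2 - v1) x (v3 - v1) works out to (0, 8, -4):

```java
public class CorrectedT {
    static double dot(double[] a, double[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    public static void main(String[] args) {
        double[] n   = {0.0, 8.0, -4.0};                              // triangle normal
        double[] v1  = {-2.0, 2.0, 0.0};                              // first vertex
        double[] o   = {0.0, 1.0, 0.0};                               // ray origin
        double[] dir = {0.0, 0.894427190999916, 0.4472135954999579};  // ray direction

        double d = dot(n, v1);      // n*p = 16
        double nO = dot(n, o);      // n*o = 8
        double nV = dot(n, dir);    // n*v ~ 5.3666

        double wrongT   = (nO + d) / nV; // (8 + 16) / 5.3666 ~ 4.4721
        double correctT = (d - nO) / nV; // (16 - 8) / 5.3666 ~ 1.4907

        System.out.println(wrongT);   // the value the question reports
        System.out.println(correctT); // the distance to the centroid
    }
}
```

Note that the wrong formula reproduces exactly the t = 4.472135954999579 from the question, while the corrected one yields the expected t of roughly 1.49.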