I am writing a raytracer in Java. I was able to get sphere tracing working, but I believe something is wrong with how I am tracing triangles.
Here is the basic algorithm, as I understand it:
- First determine if the ray even intersects the plane that the triangle lies on.
- Clip all points so they lie on the same plane as the triangle (onto the xy plane, for example).
- Determine whether the potential intersection point falls inside or outside the triangle, based on the number of polygon edges you cross when sending out a ray in an arbitrary direction along the new plane (I sketch this below).
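For context, here is a rough sketch of what I mean by steps two and three. This is illustrative rather than my exact isIntersectionVectorInsideTriangle() code; it uses plain double[] arrays instead of my Vector class so it stands alone:

// Sketch of the inside/outside test: drop the dominant axis of the plane normal
// to get 2D coordinates, then count how many triangle edges a ray sent in the +u
// direction from the test point crosses (even-odd / crossing-number test).
static boolean insideTriangle(double[] p, double[][] verts, double[] normal)
{
    // Drop the axis where the normal is largest in magnitude, so the
    // projected triangle does not degenerate into a line.
    int drop = 0;
    for (int i = 1; i < 3; i++)
    {
        if (Math.abs(normal[i]) > Math.abs(normal[drop]))
        {
            drop = i;
        }
    }
    int a = (drop + 1) % 3;
    int b = (drop + 2) % 3;

    // Even-odd test in the projected (a, b) plane over the three edges.
    boolean inside = false;
    for (int i = 0, j = 2; i < 3; j = i++)
    {
        double ui = verts[i][a], vi = verts[i][b];
        double uj = verts[j][a], vj = verts[j][b];
        boolean straddles = (vi > p[b]) != (vj > p[b]);
        if (straddles)
        {
            // u-coordinate where edge (j -> i) crosses the horizontal line v = p[b]
            double uCross = uj + (p[b] - vj) * (ui - uj) / (vi - vj);
            if (p[a] < uCross)
            {
                inside = !inside;
            }
        }
    }
    return inside;
}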
Now, here is my implementation of that (specifically the first point):
public Vector getIntersectionVector(Ray ray)
{
    Vector planeIntersectionVector = getPlaneIntersectionVector(ray, getPlaneNormal());
    if (planeIntersectionVector != null)
    {
        if (isIntersectionVectorInsideTriangle(planeIntersectionVector))
        {
            return planeIntersectionVector;
        }
        else
        {
            return null;
        }
    }
    else
    {
        return null;
    }
}
Where getPlaneIntersectionVector() is:
private Vector getPlaneIntersectionVector(Ray ray, Vector planeNormal)
{
    double vd = planeNormal.dotProduct(ray.getDirection());

    // (p_n . r_d) == 0 means the ray is parallel to the plane.
    // (p_n . r_d) > 0 means the plane normal is pointing away from the ray.
    if (vd >= 0)
    {
        return null;
    }

    double distance = planeNormal.distance(0d, 0d, 0d);
    double vo = -(planeNormal.dotProduct(ray.getOrigin()) + distance);
    double intersectionDistance = vo / vd;

    // intersectionDistance <= 0 means the "intersection" is behind the ray origin,
    // so it is not a real intersection.
    return (intersectionDistance <= 0) ? null : ray.getLocation(intersectionDistance);
}
Which basically tries to mimic this formula:

t = -(p_n · r_o + d) / (p_n · r_d)

where t is the distance along the ray at which the point is hit, r_o is the origin of the ray, r_d is the direction of the ray, p_n is the plane normal of the triangle/plane, and d is the distance from the plane the triangle lies on to the origin (0,0,0).
Am I doing that wrong? When I send out the ray from the first pixel in the image, (0,0), I am seeing that the intersectionDistance (or t) is almost 1100, which intuitively seems wrong to me. I would think that the intersection point would be much closer.
Here is the relevant data:
- Ray origin: (0, 0, 1)
- Ray direction: roughly (0.000917, -0.4689, -0.8833)
- Triangle vertices: (-0.2, 0.1, 0.1), (-0.2, -0.5, 0.2), (-0.2, 0.1, -0.3), which makes the plane normal (-1, 0, 0)

According to my code, the ray intersects the plane about 1090 units away which, as I mentioned before, seems wrong to me. The scene only spans -1.0 to 1.0 in every direction, which means the intersection is very, very far in the distance.
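To make the numbers concrete, here is a small standalone snippet that pushes that data through the same arithmetic as getPlaneIntersectionVector(), using plain doubles instead of my Vector/Ray classes (I am assuming here that planeNormal.distance(0, 0, 0) returns the Euclidean distance from the vector to the origin, i.e. its length):

public class PlaneIntersectionCheck
{
    public static void main(String[] args)
    {
        double[] normal = { -1, 0, 0 };                    // triangle plane normal
        double[] origin = { 0, 0, 1 };                     // ray origin
        double[] dir = { 0.000917, -0.4689, -0.8833 };     // ray direction

        double vd = dot(normal, dir);                      // p_n . r_d  ~= -0.000917
        double distance = Math.sqrt(dot(normal, normal));  // |p_n| = 1 (what distance(0,0,0) gives, as I understand it)
        double vo = -(dot(normal, origin) + distance);     // -(0 + 1) = -1
        double t = vo / vd;                                // ~= 1090.5

        System.out.println("vd = " + vd);
        System.out.println("vo = " + vo);
        System.out.println("t  = " + t);
    }

    static double dot(double[] a, double[] b)
    {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }
}

This reproduces the t of roughly 1090 that my raytracer reports.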
Am I doing the plane intersection wrong?
Please let me know if anything needs clarifying, or if you need any more information.