I've been given some example code to fix, and one of the functions I need to write performs ray-triangle intersection for a ray tracer. My code is currently set up like this:
/* RayHitResult is a struct that records whether the ray has hit a polygon */
RayHitResult TriangleIntersect(Ray& ray)
{
    RayHitResult result = Ray::s_defaultHitResult;
    double t = 10000; /* extreme value used as a far limit for testing */
    Vector3 point;

    /* m_vertices[0], [1] and [2] are the vertices of the triangle */
    Vector3 edge1 = m_vertices[1] - m_vertices[0];
    Vector3 edge2 = m_vertices[2] - m_vertices[0];
    Vector3 t_normal = edge1.CrossProduct(edge2); /* the plane normal */
    double d = -(m_vertices[0].DotProduct(t_normal)); /* the plane offset */

    /* t is the ray parameter at the ray-plane intersection;
       this is the value I need to calculate */
    t = -(ray.GetRayStart().DotProduct(t_normal) + d) / (ray.GetRay().DotProduct(t_normal));
    point = ray.GetRayStart() + ray.GetRay() * t;

    /* Fill in the result only if the intersection lies in front of
       the ray origin and within the far limit */
    if (t > 0 && t < 10000)
    {
        result.t = t;
        result.normal = this->m_normal;
        result.point = point;
        result.data = this;
        return result;
    }
    return result;
}
I need to calculate the value of t in order to determine whether the ray has intersected the polygon. However, the line that calculates t does not produce the expected result. I've been told the formula I'm using is correct, so can anyone tell me why my t calculation doesn't work?
The formula is t = -(n·S + d) / (n·v), which comes from substituting the ray P = S + t·v into the plane equation n·P + d = 0. Here n is the normal, d is the offset, S is the ray origin and v is the ray direction.
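For reference, here is a minimal, self-contained sketch of the same calculation that I can check by hand. The Vec3 type and the helper functions below are placeholders I wrote just for this test, not the framework's Vector3/Ray API. For a triangle in the z = 0 plane and a ray starting at z = 1 pointing straight down, the formula should give t = 1:

#include <cstdio>

/* Placeholder vector type -- not the framework's Vector3 */
struct Vec3 {
    double x, y, z;
};

static Vec3 sub(Vec3 a, Vec3 b)   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

int main()
{
    /* Triangle in the z = 0 plane */
    Vec3 v0 = { 0, 0, 0 }, v1 = { 1, 0, 0 }, v2 = { 0, 1, 0 };

    Vec3 n = cross(sub(v1, v0), sub(v2, v0)); /* plane normal, (0, 0, 1) here */
    double d = -dot(v0, n);                   /* plane offset, 0 here */

    /* Ray starting above the triangle, pointing straight down */
    Vec3 S = { 0.2, 0.2, 1.0 };
    Vec3 v = { 0.0, 0.0, -1.0 };

    double denom = dot(v, n);
    if (denom == 0.0) {
        /* Ray is parallel to the plane: no single intersection point */
        std::printf("no intersection\n");
        return 0;
    }

    double t = -(dot(S, n) + d) / denom; /* same formula as in my function */
    std::printf("t = %f (expected 1.0)\n", t);
    return 0;
}

If the formula gives the right t in an isolated case like this, then presumably the problem is somewhere else in my function rather than in the equation itself.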