I'm working on a raytracer in C, and I am trying to figure out the math for the ray-plane intersection. I have
d = ((Po - Lo) · N) / (L · N)
Now if I am correct...
N = the plane's normal vector
Po = a single point on the plane
L = the direction vector of the ray I am shooting
Lo = a point on the line
I am confused as to how this works. Does the point on the line (Lo) need to land on the plane if I am going pixel by pixel? If that is true, couldn't I just represent that point with the direction vector of the ray (L) that I am casting?
I feel like I am completely overcomplicating this, but I am utterly lost on how to get this working in my code.
EDIT:
d = a scalar (a real number).
So does d need to equal zero in order for the plane and the ray (at the point I'm looking at) to intersect? And can I use the direction vector's coordinates to represent that point on the line?