I am creating a software-based 3D renderer just to learn the concepts and mathematics. It's fun, and I have a nice spinning cube atop a grid acting as a sort of floor. The grid/floor is rendered using line segments. I have a virtual camera positioned and oriented using a simple look-at transformation. The viewing plane is arbitrarily set up to be at distance n from the "eye", i.e. at z = -n.
Everything works fine (transform from object to world to camera space, cull, project, clip, render) except for one thing. When rendering the grid, a line segment can span the viewing plane of the virtual camera, with one endpoint in front of it and one behind it. I want to render the portion that is visible, so I clip the segment to the viewing plane. The clipped endpoint, now lying on the viewing plane, is then projected like any other point. The projection is:
p'(x) = -n * p(x) / p(z)
p'(y) = -n * p(y) / p(z)
All potentially visible points will have p(z) ≤ -n. The point that was clipped to the viewing plane has p(z) = -n. Thus, I have in effect:
p'(x) = p(x)
p'(y) = p(y)
for such a point: an orthographic projection.
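To make the failure mode concrete, here is roughly what my clip-then-project step does (a minimal sketch with hypothetical Vec3/Vec2 types and camera-space input; the real code differs in the details):

```cpp
struct Vec3 { float x, y, z; };
struct Vec2 { float x, y; };

// Clip segment (a, b) against the viewing plane z = -n, keeping the part with z <= -n.
// Returns false if the whole segment lies behind the plane.
bool clipToViewingPlane(Vec3& a, Vec3& b, float n) {
    bool aIn = a.z <= -n;
    bool bIn = b.z <= -n;
    if (!aIn && !bIn) return false;        // entirely behind the camera: discard
    if (aIn && bIn)   return true;         // entirely in front: nothing to do
    float t = (-n - a.z) / (b.z - a.z);    // parameter where the segment crosses z = -n
    Vec3 hit { a.x + t * (b.x - a.x),
               a.y + t * (b.y - a.y),
               -n };
    if (aIn) b = hit; else a = hit;        // replace the endpoint behind the plane
    return true;
}

// Perspective projection onto the viewing plane, per the formulas above.
Vec2 project(const Vec3& p, float n) {
    return { -n * p.x / p.z, -n * p.y / p.z };
}
// For the clipped endpoint p.z == -n, so project() returns (p.x, p.y) unchanged:
// the orthographic behaviour described above, with no guarantee the result lies
// inside the window on the viewing plane.
```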
The values here can easily be outside the window on the viewing plane, which causes the viewport transformation to send these points way outside the bounds of the OS window. The effect is that I see stray lines flying around periodically. It's horrible.
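For completeness, my viewport transformation is along these lines (again just a sketch; the window dimensions and names are illustrative):

```cpp
struct Vec2 { float x, y; };

// Map a point on the viewing-plane window (winW x winH, centred on the view axis)
// to raster coordinates in a screenW x screenH framebuffer.
Vec2 toViewport(const Vec2& p, float winW, float winH, int screenW, int screenH) {
    return { (p.x / winW + 0.5f) * screenW,     // [-winW/2, +winW/2] -> [0, screenW]
             (0.5f - p.y / winH) * screenH };   // flip y for raster coordinates
}
// With winW = 2, a clipped endpoint at p.x = 50 lands at x = 25.5 * screenW,
// i.e. thousands of pixels outside the OS window -- hence the stray lines.
```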
Short of just doing everything the way, say, OpenGL does it (at that point I'd just use OpenGL!), what am I missing?
Thanks (if you made it this far!).
Here's a screenshot that shows the anomaly. The near right corner of the grid is just out of view. The line segment that comes down towards the near left corner of the grid has one endpoint behind the camera and is thus clipped. The clipped endpoint undergoes the (wrong) orthographic projection and ends up way out in left field.
I am not doing any view frustum culling (yet). Perhaps I should?
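In case it clarifies what I mean, this is the sort of per-segment frustum-plane clipping (as opposed to whole-object culling) I'm considering adding, run in camera space before projection. It's only a rough sketch: the plane expression below assumes a window of half-width r on the viewing plane z = -n, and the other side planes would be analogous.

```cpp
struct Vec3 { float x, y, z; };

// "Inside" test for the right-hand frustum plane, which passes through the eye and
// the window edge x = r on the plane z = -n. A value >= 0 means the point is on the
// visible side (it projects to x' <= r when it is in front of the eye).
float insideRight(const Vec3& p, float n, float r) {
    return -(n * p.x + r * p.z);
}

// Clip segment (a, b) against one plane given its signed "inside" function.
// Returns false if the segment lies entirely outside that plane.
template <typename F>
bool clipAgainst(Vec3& a, Vec3& b, F inside) {
    float da = inside(a);
    float db = inside(b);
    if (da < 0.0f && db < 0.0f) return false;    // both outside: discard segment
    if (da >= 0.0f && db >= 0.0f) return true;   // both inside: keep as-is
    float s = da / (da - db);                    // intersection parameter along a -> b
    Vec3 hit { a.x + s * (b.x - a.x),
               a.y + s * (b.y - a.y),
               a.z + s * (b.z - a.z) };
    if (da < 0.0f) a = hit; else b = hit;        // replace the outside endpoint
    return true;
}
```

If I clip each segment against the near plane and the four side planes in turn, every surviving endpoint should project inside the window, so the viewport transform can't fling anything outside the OS window.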