5
votes

I'm writing a space exploration application. I've decided on light years as the units and have accurately modeled the distances between stars. After tinkering and a lot of arduous work (mostly learning the ropes) I have the camera working correctly from the point of view of a starship traversing the cosmos.

Initially I paid no attention to the zNear parameter of gluPerspective() until I worked on planetary objects. Since my scale is in light years, I soon realized that with zNear at 1.0f I would not be able to see such objects. After experimentation I arrived at these figures:

#define POV 45
#define zNear 0.0000001f
#define zFar 100000000.0f
/* cast so the aspect ratio isn't truncated by integer division */
gluPerspective(POV, (GLdouble)WinWidth / (GLdouble)WinHeight, zNear, zFar);

This works exceptionally well: I was able to cruise my solar system (at position 0,0,0) and move up close to the planets, which look great lit and texture-mapped. However, other systems (not at position 0,0,0) were much harder to cruise through because objects moved away from the camera in unusual ways.

I noticed, however, that strange visual glitches occur when cruising through the universe. Objects behind me 'wrap around' and show up ahead; if I swing 180 degrees about the Y axis they also appear in their original place. So when warping through space, most of the stars parallax correctly, but some appear and travel in the opposite direction (which is disturbing, to say the least).

Changing zNear to 0.1f immediately corrects ALL of these glitches (but then solar-system objects can no longer be resolved). So I'm stuck. I've also tried glFrustum and it produces exactly the same results.

I use the following to view the world:

glTranslatef(pos_x, pos_y, pos_z);

With relevant camera code to orient as required. Even disabling the camera functionality does not change anything. I've even tried gluLookAt() and again it produces the same results.

Does gluPerspective() have limits when extreme zNear/zFar values are used? I tried to reduce the range, but to no avail. I even changed my world units from light years to kilometers, scaling everything up and using a bigger zNear value - nothing. HELP!

"had noticed however that strange visual glitches" what were your camera coordinates in the world when that happened? i suspect that these problems are caused by the precision of float's. you might be able to fix this (slightly) by scaling your world smaller, but it wont fix it perfectly.Rookie
Well, one obvious glitch occurred when crossing the Z axis from positive to negative. I couldn't see why this would happen, since the code worked before zNear was dropped. I thought a quick fix would be to offset all coordinates so that they remain positive. That didn't help, so I assume the crossing mentioned was just a coincidence. It happens in other places; I'll get specifics if required. – Curious Coder
Your next problem will be that while float can represent enormous distances, it has very little resolution out there. It's a binary version of scientific notation with a fixed number of significant digits. If you have a star 4.0 LY away and a planet 1.5e-5 LY from it, the planet's location is 4.0 - 1.5e-5 ≈ 4.0 LY, oops. – Ben Jackson
Yeah, the others have answered it all, but I'd like to throw in that what you're writing seems really cool and I'd like to see the finished product. Keep it up! – Cosine
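To see Ben Jackson's point concretely: the spacing between adjacent representable floats grows with magnitude. A minimal standalone C sketch (mine, not from the thread) that prints that spacing:

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* nextafterf gives the next representable float above the argument,
       so the difference is the resolution ("ulp") at that magnitude. */
    printf("resolution near 4.0 LY    : %g\n", nextafterf(4.0f, 5.0f) - 4.0f);
    printf("resolution near 4000.0 LY : %g\n", nextafterf(4000.0f, 5000.0f) - 4000.0f);
    return 0;
}

Near 4.0 the spacing is about 5e-7 LY; near 4000.0 it is about 2.4e-4 LY, already far coarser than a planetary orbit (1 AU is about 1.6e-5 LY).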

3 Answers

5
votes

The problem is that you want to resolve too much at the same time. You want to view things at solar-system scale while also having semi-galactic scale. That is simply not possible. Not with a real-time renderer.

There is only so much floating-point precision to go around. And with your zNear being incredibly close, you've basically destroyed your depth buffer for anything that is more than about 0.0001 away from your camera.

What you need to do is draw things based on distance. Near objects (on a solar system's scale) are drawn with one perspective matrix, using one depth range (say, 0 to 0.8). Then more distant objects are drawn with a different perspective matrix and a different depth range (0.8 to 1). That's really the only way you're going to make this work.
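A minimal sketch of that two-pass idea, assuming hypothetical drawFarObjects()/drawNearObjects() helpers and an aspect variable you already have (the near/far values are placeholders; the point is that each pass gets a sane zFar/zNear ratio):

/* Pass 1: distant objects (stars), mapped to the back of the depth buffer. */
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(45.0, aspect, 0.1, 1.0e8);
glDepthRange(0.8, 1.0);
glMatrixMode(GL_MODELVIEW);
drawFarObjects();

/* Pass 2: near objects (planets), mapped to the front of the depth buffer,
   so they always depth-test in front of the distant pass. */
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(45.0, aspect, 1.0e-5, 10.0);
glDepthRange(0.0, 0.8);
glMatrixMode(GL_MODELVIEW);
drawNearObjects();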

Also, you may need to compute the matrices for objects on the CPU in double-precision math, then convert them to single precision for OpenGL to use.
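For example (a sketch of one common way to apply that advice, keeping positions in double and only converting the small camera-relative offset to float):

double obj_x, obj_y, obj_z;   /* object position in light years */
double cam_x, cam_y, cam_z;   /* camera position in light years */

/* Subtract in double precision first; the resulting offset is small
   enough for float, so precision is spent near the viewer. */
glTranslatef((GLfloat)(obj_x - cam_x),
             (GLfloat)(obj_y - cam_y),
             (GLfloat)(obj_z - cam_z));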

3
votes

OpenGL should not be drawing anything farther from the camera than zFar, or closer to the camera than zNear.

But for things in between, OpenGL computes a depth value that is stored in the depth buffer, which it uses to tell whether one object is blocking another. Unfortunately, the depth buffer has limited precision (generally 16 or 24 bits), and roughly log2(zFar/zNear) bits of that precision are lost. Thus, a zFar/zNear ratio of 10^15 (~50 bits lost) is bound to cause problems. One option would be to slightly increase zNear (if you can). Otherwise, you will need to look into Split Depth Buffers or Logarithmic Depth Buffers.
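Plugging the figures from the question into that formula (a quick check, not part of the original answer):

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* zFar/zNear from the question: 100000000.0 / 0.0000001 = 1e15 */
    printf("~%.0f bits of depth precision lost\n", log2(100000000.0 / 0.0000001));
    return 0;
}

This prints ~50, which is why even a 24-bit depth buffer cannot cope.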

1
vote

Nicol Bolas already told you one piece of the story. The other is that you should start thinking about a structured way to store your coordinates: store the position of each object relative to the object that gravitationally dominates it, and use appropriate units at each scale.

So you have stars; distances between stars are measured in light years. Stars are orbited by planets; distances within a star system are measured in light minutes to light hours. Planets are orbited by moons; distances within a planetary system are measured in light seconds.

To display such scales you need to render in multiple passes. The objects with their scales form a tree. First you sort the branches from distant to close, then you traverse the tree depth-first. For each level you use projection parameters chosen so that the near and far clip planes snugly fit the objects being rendered. After rendering each level, clear the depth buffer.
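A sketch of that traversal, assuming a hypothetical node type whose children are already sorted from distant to close:

struct Node {
    double zNear, zFar;          /* clip planes fitted to this level */
    void (*draw)(struct Node *); /* renders this level's geometry    */
    struct Node **children;      /* sorted distant to close          */
    int numChildren;
};

void renderLevel(struct Node *n, double aspect)
{
    int i;

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(45.0, aspect, n->zNear, n->zFar);
    glMatrixMode(GL_MODELVIEW);

    n->draw(n);

    /* Clear depth so this coarser, more distant level cannot
       occlude the finer levels drawn next. */
    glClear(GL_DEPTH_BUFFER_BIT);

    for (i = 0; i < n->numChildren; ++i)
        renderLevel(n->children[i], aspect);
}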