
I am rendering a Sphere using OpenGL 3.2 (in Java w/ LWJGL3).

I already have a working algorithm to generate the sphere vertices (using the GL_TRIANGLE_STRIP primitive). However, I have no idea how to set the texture coordinates and normals for these vertices.

    float angleA, angleB;
    float cos, sin;
    float r1, r2;   // radii of the current and next latitude rings
    float h1, h2;   // heights (y) of the current and next latitude rings

    for (angleA = -90.0f; angleA < 90.0f; angleA += SPHERE_STEP) {
        r1 = (float) Math.cos(angleA * Math.PI / 180.0);
        r2 = (float) Math.cos((angleA + SPHERE_STEP) * Math.PI / 180.0);
        h1 = (float) Math.sin(angleA * Math.PI / 180.0);
        h2 = (float) Math.sin((angleA + SPHERE_STEP) * Math.PI / 180.0);

        for (angleB = 0.0f; angleB <= 360.0f; angleB += SPHERE_STEP) {
            cos = (float) Math.cos(angleB * Math.PI / 180.0);
            sin = -(float) Math.sin(angleB * Math.PI / 180.0);

            renderer.addVertex(r2*cos, h2, r2*sin, s1, t1, n1x, n1y, n1z);
            renderer.addVertex(r1*cos, h1, r1*sin, s2, t2, n2x, n2y, n2z);
        }
    }

My problem is that the texture coordinates s1, s2, t1 and t2 are unknown, as are the normals n1x, n1y, n1z, n2x, n2y, n2z (in the two addVertex lines). I also don't know what kind of texture I should use - I just want a ball (like a marble or a soccer ball). The following image shows the way vertices are generated (I don't have 10 reputation...): http://i.stack.imgur.com/h6B31.png

Does someone have an idea? If your suggestion is completely different, even a new algorithm, but includes texture coordinates and normals, that's perfect too!


2 Answers


You can calculate the normal simply by normalizing the given vertex. So, if your two vertex positions are:

vec3(r2*cos, h2, r2*sin)
// and
vec3(r1*cos, h1, r1*sin)

... then the corresponding normals would be:

normalize(vec3(r2*cos, h2, r2*sin))
// and 
normalize(vec3(r1*cos, h1, r1*sin))

They simply point outward from the sphere at each point.
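If you prefer plain Java to the GLSL-style pseudocode above, a minimal sketch of that normalization could look like this (the sphereNormal name is just for illustration, not part of any API):

    // The normal is simply the position vector scaled to unit length.
    static float[] sphereNormal(float x, float y, float z) {
        float len = (float) Math.sqrt(x * x + y * y + z * z);
        return new float[] { x / len, y / len, z / len };
    }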

Texturing, however, is a bit more difficult. I would suggest reading up on this in order to understand the following:

U = ((-Z/|X|) + 1)/2
V = ((-Y/|X|) + 1)/2
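Interpreting |X| as the length of the position vector (that is an assumption on my part), the same formula written in Java for a position (x, y, z) would be:

    // Sketch: applying the formula above, assuming |X| is the length of the position vector.
    float len = (float) Math.sqrt(x * x + y * y + z * z);
    float u = (-z / len + 1.0f) / 2.0f;
    float v = (-y / len + 1.0f) / 2.0f;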

Normals

For a sphere, the normals are the same as the positions. At least as long as the sphere is centered at the origin, and has radius 1.0, which is the case for the calculations in the posted code. If you picture the geometry, this makes intuitive sense. The surface of the sphere is orthogonal to a vector from the center to each point on the surface.

Or, if you like math, using a and b as the two angles that parameterize the sphere:

    [ cos(b) * cos(a) ]
v = [ cos(b) * sin(a) ]
    [ sin(b)          ]

we can calculate the two gradient vectors:

          [ - cos(b) * sin(a) ]
dv / da = [ cos(b) * cos(a)   ]
          [ 0                 ]

          [ - sin(b) * cos(a) ]
dv / db = [ - sin(b) * sin(a) ]
          [ cos(b)            ]

and then the normal is the cross product between the two gradient vectors:

[ cos(b) * cos(a) * cos(b)                                              ]
[ cos(b) * sin(a) * cos(b)                                              ]
[ cos(b) * sin(a) * sin(b) * sin(a) + cos(b) * cos(a) * sin(b) * cos(a) ]

  [ cos(b) * cos(a) * cos(b) ]
= [ cos(b) * sin(a) * cos(b) ]
  [ cos(b) * sin(b)          ]

= cos(b) * v

So the cross product is a multiple of v. And since v already has unit length, the normal is simply v itself.
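If you want to convince yourself numerically, a small throwaway check (not part of the derivation above, just an illustration) could look like this:

    // Sanity check: cross(dv/da, dv/db) should equal cos(b) * v for any a, b.
    double a = Math.toRadians(40.0), b = Math.toRadians(25.0);
    double[] v    = {  Math.cos(b) * Math.cos(a),  Math.cos(b) * Math.sin(a), Math.sin(b) };
    double[] dvda = { -Math.cos(b) * Math.sin(a),  Math.cos(b) * Math.cos(a), 0.0 };
    double[] dvdb = { -Math.sin(b) * Math.cos(a), -Math.sin(b) * Math.sin(a), Math.cos(b) };
    double[] cross = {
        dvda[1] * dvdb[2] - dvda[2] * dvdb[1],
        dvda[2] * dvdb[0] - dvda[0] * dvdb[2],
        dvda[0] * dvdb[1] - dvda[1] * dvdb[0]
    };
    for (int i = 0; i < 3; i++) {
        System.out.println(cross[i] + " vs " + Math.cos(b) * v[i]); // the two columns match
    }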

This means that if you use a shader specifically for spheres, you don't even need a separate vertex attribute for the normals. You pass in the position, and use the attribute values for both position and normal. You can use the attribute unchanged as the normal. For the position, you can scale it with the sphere radius, and translate it with the sphere position.

Texture Coordinates

As for texture coordinates, they are not really unique. One simple approach is to reuse the two angles that already parameterize the sphere, scaled to a range of 0.0 to 1.0. With the nomenclature from your code, the two texture coordinates can be calculated as:

s = angleB / 360.0f;
t = (angleA + 90.0f) / 180.0f;

The downside of this is that the spacing is quite unequal. Particularly for s, the change per distance is much higher close to the poles than around the equator. This means that if you map some kind of texture that represents a material, the size of the resulting pattern will not be uniform across the sphere.
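Plugged back into the loop from your question, a sketch of the whole thing could look like this (assuming your renderer.addVertex(x, y, z, s, t, nx, ny, nz) signature, and a unit sphere centered at the origin so the normal equals the position):

    float angleA, angleB;
    float cos, sin;
    float r1, r2;
    float h1, h2;

    for (angleA = -90.0f; angleA < 90.0f; angleA += SPHERE_STEP) {
        r1 = (float) Math.cos(angleA * Math.PI / 180.0);
        r2 = (float) Math.cos((angleA + SPHERE_STEP) * Math.PI / 180.0);
        h1 = (float) Math.sin(angleA * Math.PI / 180.0);
        h2 = (float) Math.sin((angleA + SPHERE_STEP) * Math.PI / 180.0);

        // t: 0.0 at the south pole (-90 degrees), 1.0 at the north pole (+90 degrees)
        float t1 = (angleA + 90.0f) / 180.0f;
        float t2 = (angleA + SPHERE_STEP + 90.0f) / 180.0f;

        for (angleB = 0.0f; angleB <= 360.0f; angleB += SPHERE_STEP) {
            cos = (float) Math.cos(angleB * Math.PI / 180.0);
            sin = -(float) Math.sin(angleB * Math.PI / 180.0);

            // s: one full wrap around the sphere
            float s = angleB / 360.0f;

            // Unit sphere at the origin: the normal is the same as the position.
            renderer.addVertex(r2 * cos, h2, r2 * sin, s, t2, r2 * cos, h2, r2 * sin);
            renderer.addVertex(r1 * cos, h1, r1 * sin, s, t1, r1 * cos, h1, r1 * sin);
        }
    }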

Another approach is to use cube mapping. You can use the same pattern for all six sides of the cube map. And then, the texture coordinates used for cube mapping can again be the same as what you already used for the positions and normals. One set of attribute values for all three attributes!