
When you create a brush in a 3D map editor such as Valve's Hammer Editor, the object's textures are by default repeated and aligned to the world coordinates.

How can I implement this functionality using OpenGL?

Can glTexGen be used to achieve this?

Or do I have to use the texture matrix somehow?

If I create a 3x3 box, then it's easy: set GL_TEXTURE_WRAP_S and GL_TEXTURE_WRAP_T to GL_REPEAT and set the texture coordinates to 3,3 at the edges.
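For example, for the box case I can do something like this in legacy OpenGL (tex is assumed to be an already-loaded GL_TEXTURE_2D handle, and the function name is just a placeholder):

#include <GL/gl.h>

void draw_repeated_face(GLuint tex)
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);

    glBegin(GL_QUADS); /* one 3x3 face of the box, lying in the XY plane */
    glTexCoord2f(0.0f, 0.0f); glVertex3f(0.0f, 0.0f, 0.0f);
    glTexCoord2f(3.0f, 0.0f); glVertex3f(3.0f, 0.0f, 0.0f);
    glTexCoord2f(3.0f, 3.0f); glVertex3f(3.0f, 3.0f, 0.0f);
    glTexCoord2f(0.0f, 3.0f); glVertex3f(0.0f, 3.0f, 0.0f);
    glEnd();
}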

But if the object is not an axis-aligned convex hull, it gets, uh, a bit complicated.

Basically, I want to recreate the functionality of the face edit sheet from Valve Hammer:

Valve Hammer Face Edit tool

1 Answer


Technically you can use texture coordinate generation for this, but I recommend using a vertex shader that generates the texture coordinates from the transformed vertex coordinates. Could you be a bit more specific? (I don't know Hammer very well.)


After seeing the video I understand your confusion. I think you should know that Hammer/Source probably doesn't have the drawing API generate the texture coordinates, but produces them internally.

So what you can see there are textures that are projected onto the YZ, XZ, or XY plane, depending on which major axis the face is pointing along. It then uses the local vertex coordinates as texture coordinates.

You can implement this in the code that loads a model into a Vertex Buffer Object (more efficient, since the computation is done only once), or in a GLSL vertex shader. I'll give you the pseudocode:

cross(v1, v2):
    return { x = v1.y * v2.z - v1.z * v2.y,
             y = v2.x * v1.z - v2.z * v1.x, // <- "swapped" order!
             z = v1.x * v2.y - v1.y * v2.x }

normal(face): 
    return cross(face.position[1] - face.position[0], face.position[2] - face.position[0])

foreach face in geometry:
    n = normal(face) // you'd normally precompute the normals and store them
    if abs(n.x) >= max(abs(n.y), abs(n.z)): // X major axis, project to YZ plane
        foreach (i, pos) in enumerate(face.position):
            face.texcoord[i] = { s = pos.y, t = pos.z }

    else if abs(n.y) >= abs(n.z): // Y major axis, project to XZ plane
        foreach (i, pos) in enumerate(face.position):
            face.texcoord[i] = { s = pos.x, t = pos.z }

    else: // Z major axis, project to XY plane
        foreach (i, pos) in enumerate(face.position):
            face.texcoord[i] = { s = pos.x, t = pos.y }

To make this work with glTexGen texture coordinate generation, you'd have to split your models into parts, one for each major axis. What glTexGen does is just the mapping step face.texcoord[i] = { s = pos.<>, t = pos.<> }. In a vertex shader you can do the branching directly.
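For the glTexGen route, a minimal fixed-function sketch for the Z-major group might look like this (the function name is made up; it just realizes the { s = pos.x, t = pos.y } case via GL_OBJECT_LINEAR planes, and you'd switch the planes before drawing the X- and Y-major groups):

#include <GL/gl.h>

/* Generate s from object-space x and t from object-space y
   for everything drawn while this state is enabled. */
void setup_texgen_z_major(void)
{
    static const GLfloat s_plane[4] = { 1.0f, 0.0f, 0.0f, 0.0f };
    static const GLfloat t_plane[4] = { 0.0f, 1.0f, 0.0f, 0.0f };

    glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, GL_OBJECT_LINEAR);
    glTexGeni(GL_T, GL_TEXTURE_GEN_MODE, GL_OBJECT_LINEAR);
    glTexGenfv(GL_S, GL_OBJECT_PLANE, s_plane);
    glTexGenfv(GL_T, GL_OBJECT_PLANE, t_plane);
    glEnable(GL_TEXTURE_GEN_S);
    glEnable(GL_TEXTURE_GEN_T);
}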