
If you've seen my last few questions, you'll know by now that I've been working on terrain rendering with OpenGL. I had a failed attempt at using vertex alphas and multiple render passes to blend one texture into another on a tiled heightmap... so now I've got another plan.

My WorldQuad object has 'bottom' and 'top' variables, both of type Texture. That's a JOGL-provided class that wraps an OpenGL texture: it loads the image for me, and then I can call enable() to do a glEnable(GL_TEXTURE_2D) and bind() to do a glBindTexture.

It also has an alphaMap variable of my own type, which stores yet another Texture; this one, though, has the internal format GL_ALPHA8 (in other words, it's a texture with only an alpha channel, no RGB). I currently generate it programmatically so that it's pixel-perfect.
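For reference, uploading an alpha-only texture with raw GL calls looks roughly like this (a minimal sketch, not my exact code, which goes through JOGL's texture helpers; width, height, and the buffer contents are placeholders):

int[] texId = new int[1];
gl.glGenTextures(1, texId, 0);
gl.glBindTexture(GL2.GL_TEXTURE_2D, texId[0]);
gl.glTexParameteri(GL2.GL_TEXTURE_2D, GL2.GL_TEXTURE_MIN_FILTER, GL2.GL_LINEAR);
gl.glTexParameteri(GL2.GL_TEXTURE_2D, GL2.GL_TEXTURE_MAG_FILTER, GL2.GL_LINEAR);
// one byte per texel: 0 = all 'bottom', 255 = all 'top'
ByteBuffer alphaValues = ByteBuffer.allocateDirect(width * height); // filled in elsewhere
gl.glTexImage2D(GL2.GL_TEXTURE_2D, 0, GL2.GL_ALPHA8, width, height, 0,
        GL2.GL_ALPHA, GL2.GL_UNSIGNED_BYTE, alphaValues);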

Then I use OpenGL "texture combining" (the GL_COMBINE texture environment; this blend-by-alphamap approach is often called texture splatting) to combine the bottom and top textures according to the alpha map.

The thing is, it worked beautifully!... on my work laptop. That's an IBM Thinkpad T400 with an integrated Intel chip (a GMA 950 or something similar). But I came home to my Dell Inspiron E1705 with a discrete NVIDIA GeForce Go 7900GS, and found that my program renders the quads with only the 'bottom' texture, as if the alpha isn't applied at all. I also tried it on two desktops and another laptop here. One desktop crashed, the other displayed the quad completely white (it has an old GeForce chip, which probably doesn't support OpenGL 2), and the other laptop behaved the same as mine: a quad covered in just the bottom texture.

I've already tried changing the alpha map's format to GL_RGBA, in case my graphics card for some reason didn't like a GL_ALPHA8 texture. I have also verified that all three textures are there and can render, by drawing normal quads with each individual texture, including the alpha map. I even wrote the alpha map out to a file, so I've double-checked that it exists and should be working.
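The verification pass is nothing fancy; per texture it's essentially the following (a rough sketch with placeholder coordinates, using the default texture environment):

gl.glActiveTexture(GL2.GL_TEXTURE0);
tex.bind();   // tex is whichever Texture is being checked: bottom, top, or the alpha map
tex.enable();
gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_TEXTURE_ENV_MODE, GL2.GL_MODULATE); // default env
gl.glBegin(GL2.GL_QUADS);
gl.glTexCoord2f(0f, 0f); gl.glVertex3f(x,      0f, z);
gl.glTexCoord2f(1f, 0f); gl.glVertex3f(x + 1f, 0f, z);
gl.glTexCoord2f(1f, 1f); gl.glVertex3f(x + 1f, 0f, z + 1f);
gl.glTexCoord2f(0f, 1f); gl.glVertex3f(x,      0f, z + 1f);
gl.glEnd();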

Below is my function to draw a quad based on all of this. I've added comments so that hopefully you can follow along. Please suggest anything that might fix the problem. I'm stumped! Why would it work on a crappy integrated chip but not my graphics card? What am I missing??

Thanks!!

EDIT: Okay, apparently nobody can help me... I'm back at work now. I plugged my flash drive into my work computer, started up Eclipse, hit the run button... and it worked. Then I took the drive out, put it in my personal laptop (which I brought to work today), did the exact same thing, and got just a red blob. So I documented it. First, the output of glGetString(GL_EXTENSIONS), diffed between the two computers.
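(For anyone who wants to reproduce the comparison: the dump is just the extension string split onto one line per extension, so a plain diff works on it. Something like this, with an arbitrary output path:)

String extensions = gl.glGetString(GL2.GL_EXTENSIONS);
PrintWriter out = new PrintWriter("gl_extensions.txt"); // throws FileNotFoundException
for (String ext : extensions.split(" ")) {
    out.println(ext); // one extension per line, ready for diffing
}
out.close();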

Next, screenshots of the broken and the working versions. Again, both were taken the exact same way: open Eclipse, press run, screenshot. That's it. I didn't touch my code, and there is only one copy of it, on the flash drive I swapped between the two computers. Both machines have the same JDK version.

Now, one difference between the computers is that my work PC runs XP and my laptop runs Windows 7. Should that really make a difference, though? Especially since I'm only using OpenGL 2 features, I'd think both computers would support them equally. In fact, if you look at my diff page above, my personal laptop supports many more extensions than the work laptop, as I would expect, since it has an NVIDIA GeForce Go 7900GS compared to Intel integrated graphics. So this seems backwards: if anything, I'd expect the integrated chip to choke on texture combining, not my graphics card!
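One sanity check that's easy to run on both machines (a small sketch; as far as I know, JOGL's isExtensionAvailable() also reports features that were promoted to core):

// Log basic capability info on each machine before drawing anything.
System.out.println("GL_VENDOR:   " + gl.glGetString(GL2.GL_VENDOR));
System.out.println("GL_RENDERER: " + gl.glGetString(GL2.GL_RENDERER));
System.out.println("GL_VERSION:  " + gl.glGetString(GL2.GL_VERSION));
// GL_COMBINE has been core since OpenGL 1.3, but check the extension anyway
System.out.println("combine: " + gl.isExtensionAvailable("GL_ARB_texture_env_combine"));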

Help!

public void drawWorldQuad(GL2 gl, float x, float z, WorldQuad q, WorldVertex tl, WorldVertex tr, WorldVertex bl, WorldVertex br) {
    WorldVertex[] verts = new WorldVertex[] {
        bl, br, tr, tl
    };
    final float[][] texCoords = new float[][] {
            {0.0f, 0.0f},
            {1.0f, 0.0f},
            {1.0f, 1.0f},
            {0.0f, 1.0f}
    };
    final float[][] coords = new float[][] {
            {x,z},
            {x+1,z},
            {x+1,z+1},
            {x,z+1}
    };

    gl.glColor4f(1.0f, 1.0f, 1.0f, 1.0f);

    /* BEGIN TEXTURE BLENDING SECTION: */

    /* TEXTURE0 is the alpha map; a texture of type GL_ALPHA8, no rgb channels */
    gl.glActiveTexture(GL2.GL_TEXTURE0);
    q.alphaMap.getTexture().bind();
    q.alphaMap.getTexture().enable();
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_TEXTURE_ENV_MODE, GL2.GL_DECAL);

    /* TEXTURE1 is the 'bottom' texture */
    gl.glActiveTexture(GL2.GL_TEXTURE1);
    q.bottom.getTexture().bind();
    q.bottom.getTexture().enable();
    // use the rgb from the bottom texture
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_TEXTURE_ENV_MODE, GL2.GL_COMBINE);
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_COMBINE_RGB, GL2.GL_DECAL);
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_SOURCE0_RGB, GL2.GL_TEXTURE);
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_OPERAND0_RGB, GL2.GL_SRC_COLOR);
    //------------------------
    // use the alpha value from the alphaMap
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_COMBINE_ALPHA, GL2.GL_DECAL);
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_SOURCE0_ALPHA, GL2.GL_PREVIOUS);
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_OPERAND0_ALPHA, GL2.GL_SRC_ALPHA);

    /* TEXTURE2 is the 'top' texture */
    gl.glActiveTexture(GL2.GL_TEXTURE2);
    q.top.getTexture().bind();
    q.top.getTexture().enable();
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_TEXTURE_ENV_MODE, GL2.GL_COMBINE);
    // interpolate between texture1 and texture2's colors, using the alpha values
    // from texture1 (which were taken from the alphamap)
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_COMBINE_RGB, GL2.GL_INTERPOLATE);
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_SOURCE0_RGB, GL2.GL_PREVIOUS);
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_SOURCE1_RGB, GL2.GL_TEXTURE);
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_SOURCE2_RGB, GL2.GL_PREVIOUS);
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_OPERAND0_RGB, GL2.GL_SRC_COLOR);
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_OPERAND1_RGB, GL2.GL_SRC_COLOR);
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_OPERAND2_RGB, GL2.GL_SRC_ALPHA);
    //------------------------
    // interpolate the alphas (this doesn't really matter, neither of the textures
    // really have alpha values)
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_COMBINE_ALPHA, GL2.GL_INTERPOLATE);
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_SOURCE0_ALPHA, GL2.GL_PREVIOUS);
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_SOURCE1_ALPHA, GL2.GL_TEXTURE);
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_SOURCE2_ALPHA, GL2.GL_PREVIOUS);
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_OPERAND0_ALPHA, GL2.GL_SRC_ALPHA);
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_OPERAND1_ALPHA, GL2.GL_SRC_ALPHA);
    gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_OPERAND2_ALPHA, GL2.GL_SRC_ALPHA);

    gl.glBegin(GL2.GL_QUADS);
    {
        for(int j=0;j<verts.length;j++) {
            // (this loop and the arrays used here are for convenience, it just draws
            // a quad; verts.length == 4)

            WorldVertex v = verts[j];

            gl.glMultiTexCoord2f(GL2.GL_TEXTURE0, texCoords[j][0], texCoords[j][1]);
            gl.glMultiTexCoord2f(GL2.GL_TEXTURE1, texCoords[j][0], texCoords[j][1]);
            gl.glMultiTexCoord2f(GL2.GL_TEXTURE2, texCoords[j][0], texCoords[j][1]);

            gl.glVertex3f(coords[j][0], (float)v.height * VERTICAL_SCALE, coords[j][1]);
        }
    }
    gl.glEnd();

}

1 Answer


IT WORKS!! After hours of trying stuff, I finally got it.

After setting TEXTURE0, I changed this line:

gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_TEXTURE_ENV_MODE, GL2.GL_DECAL);

to this:

gl.glTexEnvi(GL2.GL_TEXTURE_ENV, GL2.GL_TEXTURE_ENV_MODE, GL2.GL_REPLACE);

I just changed it on a whim, since I was never clear on the difference between DECAL and REPLACE, and it worked! In hindsight this seems to make sense: if I'm reading the spec's texture environment tables right, GL_DECAL is undefined for alpha-only internal formats like GL_ALPHA8, so every driver is free to do something different with it, while GL_REPLACE on an alpha texture is well defined (it passes the fragment color through and takes the alpha from the texture).