0 votes

I am developing a 3D rendering engine for Android, and I have run into some issues with the depth buffer. I am drawing some cubes: one big one and two small ones that fall on top of the bigger one. While rendering I noticed that something is obviously wrong with the depth buffer, as seen in this screenshot:

Depth Buffer Problem on Device

This screenshot was taken on an HTC Hero (running Android 2.3.4) with OpenGL ES 1.1. The whole application is (still) targeted at OpenGL ES 1.1, and it looks the same on the emulator.

These are the calls in my onSurfaceCreated method in the renderer:

public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    Log.d(TAG, "onsurfacecreated method called");

    int[] depthbits = new int[1];
    gl.glGetIntegerv(GL_DEPTH_BITS, depthbits, 0);
    Log.d(TAG, "Depth Bits: " + depthbits[0]);

    gl.glDisable(GL_DITHER);

    gl.glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_FASTEST);

    gl.glClearColor(1, 1, 1, 1);
    gl.glClearDepthf(1f);
    gl.glEnable(GL_CULL_FACE);
    gl.glShadeModel(GL_SMOOTH);
    gl.glEnable(GL_DEPTH_TEST);

    gl.glMatrixMode(GL_PROJECTION);
    gl.glLoadMatrixf(
            GLUtil.matrix4fToFloat16(mFrustum.getProjectionMatrix()), 0);

    setLights(gl);
}

The GL call for the depth bits returns 16 on the device and 0 on the emulator. It would have made sense if it only failed on the emulator, since there is obviously no depth buffer present there. (I've tried calling setEGLConfigChooser(true), so it would create a config with a depth buffer as close to 16 bits as possible, but that didn't work on the emulator. It wasn't necessary on the device.)
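For reference, GLSurfaceView also has an overload of setEGLConfigChooser that requests explicit channel sizes, which asks EGL for a config with a real 16-bit depth buffer instead of whatever the default chooser picks. A minimal configuration sketch; glSurfaceView and renderer stand in for whatever instances the activity actually creates:

```java
// Configuration fragment: must run before setRenderer() is called.
// glSurfaceView and renderer are placeholders for the app's own objects.
// Arguments are red, green, blue, alpha, depth, stencil bit sizes.
glSurfaceView.setEGLConfigChooser(5, 6, 5, 0, 16, 0);
glSurfaceView.setRenderer(renderer);
```

If EGL cannot satisfy the request (as on some emulator images), the surface creation fails loudly rather than silently running without a depth buffer, which makes the problem easier to spot.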

In my onDrawFrame method I make the following OpenGL Calls:

    gl.glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    gl.glClearDepthf(1);

And then for each of the cubes:

    gl.glEnableClientState(GL_VERTEX_ARRAY);
    gl.glFrontFace(GL_CW);
    gl.glVertexPointer(3, GL_FIXED, 0, mVertexBuffer);
    // gl.glColorPointer(4, GL_FIXED, 0, mColorBuffer);
    gl.glEnableClientState(GL_NORMAL_ARRAY);
    gl.glNormalPointer(GL_FIXED, 0, mNormalBuffer);
    // gl.glEnable(GL_TEXTURE_2D);
    // gl.glTexCoordPointer(2, GL_FLOAT, 0, mTexCoordsBuffer);
    gl.glDrawElements(GL_TRIANGLES, mIndexBuffer.capacity(),
            GL_UNSIGNED_SHORT, mIndexBuffer);
    gl.glDisableClientState(GL_NORMAL_ARRAY);
    gl.glDisableClientState(GL_VERTEX_ARRAY);

What am I missing? If more code is needed just ask.

Thanks for any advice!

Depth buffer artifacts are usually caused by the projection matrix. Increase your near value or decrease your far value. – Piotr Praszmo
I have the near value at 5 and the far value at 50. Even if I set them to 10 and 20, it doesn't change anything. – Chnoch

2 Answers

1 vote

I got it to work correctly now. The problem was not OpenGL itself; it was (as Banthar mentioned) the projection matrix. I manage the projection matrix myself, and the calculation of the final matrix was somehow corrupted (or at least not what OpenGL expects). I can't remember where I got the algorithm for my calculation, but once I changed it to the way OpenGL calculates the projection matrix (or directly called glFrustumf(...)), it worked fine.
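For anyone hitting the same issue: the matrix that glFrustumf builds is defined in the OpenGL ES 1.1 specification, so a hand-managed projection matrix can be checked against it entry by entry. A minimal sketch in plain Java (the Frustum class and frustumM method are made-up names for illustration), producing the column-major layout OpenGL expects:

```java
// Builds the same column-major projection matrix that glFrustumf
// produces, per the OpenGL ES 1.1 specification.
// Frustum/frustumM are hypothetical names, not part of any API.
public class Frustum {

    public static float[] frustumM(float left, float right,
                                   float bottom, float top,
                                   float near, float far) {
        float[] m = new float[16]; // column-major; untouched entries stay 0
        m[0]  = 2f * near / (right - left);          // x scale
        m[5]  = 2f * near / (top - bottom);          // y scale
        m[8]  = (right + left) / (right - left);     // A
        m[9]  = (top + bottom) / (top - bottom);     // B
        m[10] = -(far + near) / (far - near);        // C: maps eye z into clip z
        m[11] = -1f;                                 // puts -z_eye into w
        m[14] = -2f * far * near / (far - near);     // D
        return m;
    }

    public static void main(String[] args) {
        // Same near/far as mentioned in the comments: near = 5, far = 50.
        float[] m = frustumM(-1f, 1f, -1f, 1f, 5f, 50f);
        System.out.println("m[10] = " + m[10] + ", m[14] = " + m[14]);
    }
}
```

Comparing a hand-rolled matrix against this (or simply calling gl.glFrustumf(left, right, bottom, top, near, far) and reading the result back) quickly shows whether the matrix is the culprit. A wrong sign on m[11] or m[10] in particular produces exactly the kind of broken occlusion shown in the screenshot.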

0 votes

Try setting the depth comparison function and making sure depth writes are enabled:

    gl.glDepthFunc(GL_LEQUAL);
    gl.glDepthMask(true);