
So I'm testing my shaders (vertex + fragment) on Apple's OpenGL Shader Builder (by simply importing my shader files), and even though they seem to work fine there, they don't work when I try to use them in my OpenGL application. I'm going for the simplest form of ambient + diffuse lighting.

Edit: I'm importing data from this mesh (Wavefront OBJ) -> http://pastebin.com/cDyjHHNn

Here is a capture of the problem (teapot in OpenGL Shader Builder, cube in my OpenGL application):


My Vertex shader:

#version 120

vec4 ambien = vec4(0.2,0.2,0.2,1.0);
vec4 diffuz = vec4(1.0,1.0,1.0,1.0);

//Fixed Light source position.
vec3 LS = vec3(0.0, 15.0, 5.0);

void main()
{

        gl_Position = gl_ProjectionMatrix * gl_ModelViewMatrix * gl_Vertex;

        vec4 LS2;
        //LS2 = gl_ModelViewProjectionMatrix * vec4(LS,1.0);
        vec3 normal, lightDir;
        vec4 diffuse, ambient, globalAmbient;
        float NdotL;

        normal = normalize(gl_NormalMatrix * gl_Normal);
        lightDir = normalize(LS.xyz);
        NdotL = max(dot(normal, lightDir), 0.0);
        diffuse = diffuz * diffuz;

        ambient = ambien * ambien;
        globalAmbient = ambien * ambien;

        gl_FrontColor = NdotL * diffuse + globalAmbient + ambient;
}

My Fragment Shader:

#version 120

void main(void)
{

    gl_FragColor = gl_Color;
}

And my OpenGL application drawing routine:

- (void) drawFrameWithObject
{

    [[self openGLContext] makeCurrentContext];
    [sM useProgram];


    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glEnable(GL_DEPTH_TEST);

    glClearColor(1.0f,1.0f,1.0f,1.0f);

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();

    float aspect = (float)1024 / (float)576;
    glOrtho(-aspect, aspect, -1, 1, -1, 1);

    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(0.0f, 0.0f, 0.0f);
    glScalef(0.3f, 0.3f, 0.3f);

    glRotatef(self.rotValue, 1.0f, 1.0f, 0.0f);


    glEnableClientState(GL_VERTEX_ARRAY);  
    //glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);

    glBindBuffer(GL_ARRAY_BUFFER, vBo[0]);
    glVertexPointer(3, GL_FLOAT, 0, 0);

    glBindBuffer(GL_ARRAY_BUFFER, vBo[1]);
    glNormalPointer(GL_FLOAT, 0, 0);

    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vBo[2]);
    glDrawElements(GL_TRIANGLES, object.indicesCount, GL_UNSIGNED_INT, 0);

    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glDisableClientState(GL_VERTEX_ARRAY);

    [[self openGLContext] flushBuffer];
}

Bottom line: since my shader works in OpenGL Shader Builder, I think I can safely assume the problem is not in my shaders. Could it be the transformations in my application? Some OpenGL state/setting/attribute I'm missing? I've experimented quite a lot and I'm not sure what's wrong.

What does your geometry data look like? Seems like your cube just has the wrong normals (looks like all vertices have the same normal). – Christian Rau
pastebin.com/cDyjHHNn is my .obj (Wavefront) mesh. I'll edit and place it in my question too. – apoiat
Then the next question: does your OBJ loading work correctly? A working OBJ file doesn't make your loading code immune to errors. Wait, I already found the problem. – Christian Rau

1 Answer


You set glNormalPointer to your supposed normal data, but you forgot to enable the normal array in the first place with glEnableClientState(GL_NORMAL_ARRAY) (and don't forget to disable it again after drawing), like you did with the position array. So all your vertices get the same default normal, which fits exactly the effect you observe.
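
For illustration, here is a minimal sketch of the client-state portion of your drawFrameWithObject with the normal array enabled (based on the code in the question; everything else stays the same):

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);   // without this, glNormalPointer is ignored
                                        // and the current default normal is used

glBindBuffer(GL_ARRAY_BUFFER, vBo[0]);
glVertexPointer(3, GL_FLOAT, 0, 0);     // positions

glBindBuffer(GL_ARRAY_BUFFER, vBo[1]);
glNormalPointer(GL_FLOAT, 0, 0);        // normals, now actually sourced per vertex

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vBo[2]);
glDrawElements(GL_TRIANGLES, object.indicesCount, GL_UNSIGNED_INT, 0);

glBindBuffer(GL_ARRAY_BUFFER, 0);
glDisableClientState(GL_NORMAL_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);

This also explains why the same shader looked fine in OpenGL Shader Builder: presumably the builder's built-in teapot geometry is drawn with its normal array enabled, so gl_Normal varies per vertex there, while in your application every vertex received the same default normal.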