
I'm trying to load a texture in OpenGL ES under iOS. This is the approach I'm taking:

  1. parsing an obj, from which I get position, normal and texcoord attributes.
  2. enabling (for this test) only position and texcoords to the shaders.
  3. converting an arbitrary 512x512 png to raw and loading that into GL, through sampler 0.
  4. in the vertex shader, merely returning gl_Position in canonical volume / clip space while passing the texcoords to the fragment shader.
  5. in the fragment shader, assigning to gl_FragColor the lookup from texture2d.

The texture file, t1.png: [texture image]

The render result:

[screenshot: the cube renders with no texture applied]

Clearly no texture is mapped onto the cube. I must be getting some value out of texture2D(), though, because as you'll see there's no lighting or vertex color being passed or processed in the shaders, yet the cube isn't black. My guess is that I'm somehow messing up the conversion of the PNG to a raw buffer before uploading it to texture memory.

This is the .obj file, a test cube:

# Blender v2.63 (sub 0) OBJ File: ''
# www.blender.org
mtllib cube2.mtl
o Cube
v 1.000000 -1.000000 -1.000000
v 1.000000 -1.000000 1.000000
v -1.000000 -1.000000 1.000000
v -1.000000 -1.000000 -1.000000
v 1.000000 1.000000 -0.999999
v 0.999999 1.000000 1.000001
v -1.000000 1.000000 1.000000
v -1.000000 1.000000 -1.000000
vt 0.000000 0.000000
vt 1.000000 0.000000
vt 1.000000 1.000000
vt 0.000000 1.000000
vn 0.000000 -1.000000 0.000000
vn 0.000000 1.000000 0.000000
vn 1.000000 0.000000 0.000000
vn -0.000000 -0.000000 1.000000
vn -1.000000 -0.000000 -0.000000
vn 0.000000 0.000000 -1.000000
usemtl Material
s off
f 1/1/1 2/2/1 3/3/1
f 1/1/1 3/3/1 4/4/1
f 5/1/2 8/2/2 7/3/2
f 5/1/2 7/3/2 6/4/2
f 1/1/3 5/2/3 6/3/3
f 1/1/3 6/3/3 2/4/3
f 2/1/4 6/2/4 7/3/4
f 2/1/4 7/3/4 3/4/4
f 3/1/5 7/2/5 8/3/5
f 3/1/5 8/3/5 4/4/5
f 5/1/6 1/2/6 4/3/6
f 5/1/6 4/3/6 8/4/6

Then there's code which parses the obj and builds a data structure; the normals are parsed correctly. The relevant snippets are:

Making sure position and texcoords are bound during linking:

        // Attach vertex shader to program.
        glAttachShader(_program, vertShader);

        // Attach fragment shader to program.
        glAttachShader(_program, fragShader);

        // Bind attribute locations.
        glBindAttribLocation(_program, 0, "position");
        glBindAttribLocation(_program, 1, "tCoordinates");

The vertex shader (just passes the texcoords to the fragment shader):

uniform mat4 modelViewProjectionMatrix;
uniform mat3 normalMatrix;

attribute vec4 position;
attribute vec2 tCoordinates;

varying highp vec2 tCoordinatesVarying;

void main()
{
    tCoordinatesVarying = tCoordinates;
    gl_Position = modelViewProjectionMatrix * position;
}

The fragment shader, which should be getting a vec4 from the texture lookup and simply passing it on as the fragment color:

uniform sampler2D s_texture;

varying highp vec2 tCoordinatesVarying;

void main()
{
    gl_FragColor = texture2D(s_texture, tCoordinatesVarying);
}

And the pre-drawing setup code in the app, which sets the vertex attributes, the texture, and the texture unit / sampler binding (ignore my not releasing the CG objects for now):

    NSString* path = [[NSBundle mainBundle] pathForResource:@"t1.png" ofType:nil];
    NSData* texData = [NSData dataWithContentsOfFile:path];
    UIImage* image = [UIImage imageWithData:texData];
    GLuint width = CGImageGetWidth(image.CGImage);
    GLuint height = CGImageGetHeight(image.CGImage);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    void* imageData = malloc(height*width*4);
    CGContextRef context = CGBitmapContextCreate(imageData, width, height, 8, 4*width, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Host);
    CGContextClearRect(context, CGRectMake(0, 0, width, height));
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image.CGImage);
    CGContextRelease(context);

    [EAGLContext setCurrentContext:self.context];
    [self loadShaders];

    glEnable(GL_DEPTH_TEST);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 4);

    glActiveTexture(GL_TEXTURE0);

    glGenTextures(1, &_textureId);
    glBindTexture(GL_TEXTURE_2D, _textureId);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, imageData);

    int loc = glGetUniformLocation(_program, "s_texture");
    glUniform1i(loc, 0);

    glGenVertexArraysOES(1, &_vertexArray);
    glBindVertexArrayOES(_vertexArray);

    glGenBuffers(1, &_vertexBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, vboData->vertexAttributesSize, vboData->vertexAttributes, GL_STATIC_DRAW);
    glEnableVertexAttribArray(GLKVertexAttribPosition);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, vboData->vertexAttributesStride,vboData->vPositionOffset);
    glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
    glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, vboData->vertexAttributesStride,vboData->vTCoordinatesOffset);
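As an aside (not in the original code): a cheap way to catch silent failures in setup code like the above is to poll glGetError() after each stage; checkGLError here is a hypothetical helper:

    static void checkGLError(const char* stage)
    {
        GLenum err = glGetError();
        if (err != GL_NO_ERROR)
            NSLog(@"GL error 0x%04X after %s", err, stage);
    }

    // For example, right after the texture upload:
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, imageData);
    checkGLError("glTexImage2D");

(In this particular case glGetError() wouldn't have flagged the bug, since enabling the wrong attribute index is legal GL, but it quickly rules out texture-upload errors.)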
Comments:

- zacaj: Try outputting your texture coordinates directly to gl_FragColor, and see what you get.
- SaldaVonSchwartz: I'm not sure I'm following you; the texcoords are vec2 and gl_FragColor is vec4. You mean use whatever mapping just to make sure the coordinates are really coming through? For instance, gl_FragColor = vec4(tCoordinatesVarying, 0, 1), so that if I get nothing, or the same color throughout the fragments, it'd mean tCoordinatesVarying is always 0?
- zacaj: Exactly. With that code you should have red in one corner, green in the opposite one, and a blend between them in the middle.
- SaldaVonSchwartz: YEP... you were right. Thanks! From your test, I realized that I still had glEnableVertexAttribArray(GLKVertexAttribTexCoord0); and clearly GLKVertexAttribTexCoord0 doesn't map to 1, which was the location I had manually set for the texcoord attribute. By doing glEnableVertexAttribArray(1); instead, I started getting the texcoords inside the shader.
- zacaj: Haha, I have no idea. Stack Overflow isn't really designed for troubleshooting...
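The debug trick from the comments can be dropped straight into the fragment shader. This sketch visualizes the interpolated texcoords as color, with s mapped to red and t to green, so a single flat color across the whole cube means the varying is constant (most likely zero):

    varying highp vec2 tCoordinatesVarying;

    void main()
    {
        // Debug: visualize texcoords instead of sampling the texture.
        gl_FragColor = vec4(tCoordinatesVarying, 0.0, 1.0);
    }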

1 Answer


From zacaj's suggestion, I realized I just wasn't getting any texcoords into the shader. The bug was:

glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, vboData->vertexAttributesStride,vboData->vTCoordinatesOffset);

So I was enabling the attribute index referenced by GLKVertexAttribTexCoord0, but pointing the data at attribute 1. Clearly, the GLK constant doesn't translate to 1.

By instead doing

glEnableVertexAttribArray(1);
glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, vboData->vertexAttributesStride,vboData->vTCoordinatesOffset);

I started getting texcoords in the shader.
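One way to avoid this class of bug is to define the attribute indices once and use the same constant in glBindAttribLocation, glEnableVertexAttribArray, and glVertexAttribPointer. A sketch along those lines (the enum names are mine, not from the original code):

    enum {
        ATTRIB_POSITION = 0,
        ATTRIB_TEXCOORD = 1,
    };

    // Before linking:
    glBindAttribLocation(_program, ATTRIB_POSITION, "position");
    glBindAttribLocation(_program, ATTRIB_TEXCOORD, "tCoordinates");

    // When setting up the VAO:
    glEnableVertexAttribArray(ATTRIB_TEXCOORD);
    glVertexAttribPointer(ATTRIB_TEXCOORD, 2, GL_FLOAT, GL_FALSE,
                          vboData->vertexAttributesStride, vboData->vTCoordinatesOffset);

With this pattern the mismatch can't happen, because GLKVertexAttribTexCoord0 (whose value GLKit chooses, not you) never enters the picture.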