My scene consists of a plane that I'm shading with two textures. The first, bottom-most texture is a solid image coming from the iPhone's camera; the second is a kind of viewfinder which I need to overlay over the camera input (with transparency). I'm getting dark black lines at the borders of the solid shapes in my transparent texture.
I've done some research on this and came to the understanding that this artifact is the result of texture interpolation combined with premultiplied alpha. Since Xcode automatically converts all PNG images to premultiplied PNGs, and the code I wrote to load the image also uses a premultiplied-alpha context, I'm stuck on where exactly the problem lies.
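To illustrate what I think is happening (my own numbers, just to check my understanding):

// One texel of the overlay is opaque white, its neighbour fully transparent:
//   (1, 1, 1, 1)  and  (0, 0, 0, 0)
// Sampling halfway between them with bilinear filtering yields
//   (0.5, 0.5, 0.5, 0.5)
// Straight-alpha blending then weights that sample's RGB by its alpha again:
//   dst.rgb * (1.0 - 0.5) + vec3(0.5) * 0.5   // the white contributes only 0.25
// whereas premultiplied compositing uses the RGB as-is:
//   vec3(0.5) + dst.rgb * (1.0 - 0.5)         // the white contributes 0.5
// That doubled alpha weighting is what I believe shows up as a dark fringe.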
I've tried the following solutions:
- made sure that pixels with an alpha value of 0 also had their RGB values set to 0
- turned off Xcode's automatic PNG compression so that it leaves the texture exactly as I provide it
- fiddled with the fragment shader to avoid the mix() call
- changed the CGImageAlphaInfo value used when creating the bitmap context
Important: I'm not using glBlendFunc(). I feed both textures to a single fragment shader and try to mix them there, so a solution based on that GL call won't get me any further.
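For context, this is roughly how I bind the two textures to the shader's samplers (a sketch from memory; texture0 and texture1 are the sampler uniforms from the shader further down, while program, cameraTex and viewfinderTex are my own handles):

// Bind the camera texture to texture unit 0
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, cameraTex);
glUniform1i(glGetUniformLocation(program, "texture0"), 0);

// Bind the viewfinder overlay to texture unit 1
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, viewfinderTex);
glUniform1i(glGetUniformLocation(program, "texture1"), 1);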
Here's the code I'm using for loading the transparent texture:
shared_ptr<ImageData> IOSFileSystem::loadImageFile(string path, bool flip) const
{
    cout << path << endl;

    // Result
    shared_ptr<ImageData> result = shared_ptr<ImageData>();

    // Convert C++ string to NSString
    NSString *convertedPathString = [NSString stringWithCString:path.c_str() encoding:[NSString defaultCStringEncoding]];
    NSString *fullPath = [NSString stringWithFormat:@"%@%@", [[NSBundle mainBundle] resourcePath], convertedPathString];

    // Check if the file exists
    if([[NSFileManager defaultManager] fileExistsAtPath:fullPath])
    {
        // Load image
        UIImage *image = [[[UIImage alloc] initWithContentsOfFile:fullPath] autorelease];
        CGImageRef imageRef = image.CGImage;

        // Allocate memory for the image
        size_t width = CGImageGetWidth(imageRef);
        size_t height = CGImageGetHeight(imageRef);
        GLubyte *spriteData = (GLubyte *) calloc(width * height * 4, sizeof(GLubyte));

        // Create drawing context
        CGContextRef context = CGBitmapContextCreate(spriteData, width, height, 8, width * 4, CGImageGetColorSpace(imageRef), kCGImageAlphaPremultipliedLast);

        // Flip for the OpenGL coordinate system
        if(flip)
        {
            CGContextTranslateCTM(context, 0, image.size.height);
            CGContextScaleCTM(context, 1.0, -1.0);
        }

        // Draw & release
        CGContextDrawImage(context, CGRectMake(0.0, 0.0, width, height), imageRef);
        CGContextRelease(context);

        // Put the result in a shared ptr.
        // Don't free() spriteData here: the shared pointer takes care of that.
        // Since shared_ptr doesn't know how to release calloc'd data, we pass &std::free as the deleter.
        shared_ptr<GLubyte> spriteDataPtr = shared_ptr<GLubyte>(spriteData, &std::free);
        result = shared_ptr<ImageData>(new ImageData(path, width, height, spriteDataPtr));
    }
    else
    {
        cout << "IOSFileSystem::loadImageFile -> File does not exist at path.\nPath: " + path << endl;
        exit(1);
    }

    return result;
}
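One thing I considered: as far as I can tell, CGBitmapContextCreate doesn't accept a non-premultiplied alpha format for drawing contexts, so if I wanted straight alpha the only option I see is to undo the premultiplication by hand after CGContextDrawImage. A sketch of what I mean (unPremultiply is my own hypothetical helper, operating on the spriteData buffer from above):

// Convert premultiplied RGBA back to straight alpha, in place.
// Lossy where alpha is small, since the premultiplied values were already
// rounded to bytes by Core Graphics.
static void unPremultiply(GLubyte *pixels, size_t width, size_t height)
{
    for(size_t i = 0; i < width * height * 4; i += 4)
    {
        GLubyte alpha = pixels[i + 3];
        if(alpha == 0) continue; // nothing to recover, leave RGB at 0

        pixels[i]     = (GLubyte) (pixels[i]     * 255 / alpha);
        pixels[i + 1] = (GLubyte) (pixels[i + 1] * 255 / alpha);
        pixels[i + 2] = (GLubyte) (pixels[i + 2] * 255 / alpha);
    }
}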
Here's how I upload the pixels to the texture:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
GL_RGBA, GL_UNSIGNED_BYTE, pixels);
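The sampler state might matter here too, since the fringes come from filtering. This is the texture setup I believe is in play (a sketch of my own parameters; GL_LINEAR is what triggers the interpolation between opaque and transparent texels):

// Bilinear filtering interpolates neighbouring texels, which is where the
// opaque/transparent mixing happens; GL_NEAREST would sidestep it at the
// cost of blocky edges.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// Clamp so border texels aren't blended with the opposite edge (required
// for non-power-of-two textures on OpenGL ES 2.0 anyway).
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);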
Here's a stripped-down version of the fragment shader:
void main(void)
{
    lowp vec4 camera = texture2D(texture0, destinationTexCoord);
    lowp vec4 viewfinder = texture2D(texture1, destinationTexCoord);

    lowp vec4 result = mix(camera, viewfinder, viewfinder.a);

    gl_FragColor = result;
}
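If my premultiplied-alpha theory is right, then mix() may be the wrong operator here: mix(camera, viewfinder, viewfinder.a) computes camera * (1.0 - a) + viewfinder * a, which multiplies the already-premultiplied viewfinder RGB by its alpha a second time. My understanding is that the premultiplied equivalent of "source over" would look like this instead (a sketch with the same uniforms as above, though I may be misunderstanding something):

void main(void)
{
    lowp vec4 camera = texture2D(texture0, destinationTexCoord);
    lowp vec4 viewfinder = texture2D(texture1, destinationTexCoord);

    // "Source over" for premultiplied alpha: the overlay's RGB already
    // carries its alpha, so only the camera term gets weighted.
    gl_FragColor = viewfinder + camera * (1.0 - viewfinder.a);
}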