
I have a partially-transparent UIImage that I would like to convert to a JPEG.

NSData *output = UIImageJPEGRepresentation(myUIImage, 0.90);

The JPEG always has a white background. I would like it to be black. How can I do that?

Performance is a concern. The image has just been rendered in Core Image, where it would also be possible to set a background color:

CIFilter *filter = [CIFilter filterWithName:@"CIPixellate"];
[filter setDefaults];
[filter setValue:[CIImage imageWithCGImage:editImage.CGImage] forKey:kCIInputImageKey];
[filter setValue:@(amount) forKey:@"inputScale"];
[filter setValue:vector forKey:@"inputCenter"];
CIImage* result = [filter valueForKey:kCIOutputImageKey];

Currently I immediately re-render 'result' into a new UIGraphics image context:

    CGContextSetFillColorWithColor(ref, backgroundFillColor.CGColor);
    CGRect drawRect = (CGRect){{0,0},editImage.size};
    CGContextFillRect(ref, drawRect);
    CGContextDrawImage(ref, drawRect, cgImage);
    UIImage* filledImage = UIGraphicsGetImageFromCurrentImageContext();

but that step adds up to 82% more execution time compared with skipping it and accepting a white JPEG background.

I'd really appreciate help with this. Thank you.

Update: I tried the following with CISourceOverCompositing, which increased runtime by 198% in some cases:

CIFilter * constantColorFilter = [CIFilter filterWithName:@"CIConstantColorGenerator"];
[constantColorFilter setValue:[CIColor colorWithCGColor:[backgroundFillColor CGColor]] forKey:kCIInputColorKey];

CIFilter * composeFilter = [CIFilter filterWithName:@"CISourceOverCompositing"];
CIImage * bgColorResult = [constantColorFilter valueForKey:kCIOutputImageKey];
[composeFilter setValue:bgColorResult forKey:kCIInputBackgroundImageKey];
[composeFilter setValue:pixelateResult forKey:kCIInputImageKey]; // pixelateResult is the CIPixellate output ('result' above)
result = [composeFilter valueForKey:kCIOutputImageKey];

I tried using singleton CIFilters to avoid re-creating the CIFilter objects, but it had a trivial impact on performance.
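
One variant I still plan to profile: since CIConstantColorGenerator produces an image with infinite extent, cropping the background to the pixellated image's extent before compositing is commonly recommended. I don't know yet whether it changes the numbers above; this is only a sketch, reusing the variable names from the snippets above (pixelateResult is the CIPixellate output):

// Sketch only: crop the infinite-extent constant color before compositing
CIFilter *colorGen = [CIFilter filterWithName:@"CIConstantColorGenerator"];
[colorGen setValue:[CIColor colorWithCGColor:[backgroundFillColor CGColor]] forKey:kCIInputColorKey];
CIImage *croppedBackground = [[colorGen valueForKey:kCIOutputImageKey] imageByCroppingToRect:[pixelateResult extent]];

CIFilter *composeFilter = [CIFilter filterWithName:@"CISourceOverCompositing"];
[composeFilter setValue:croppedBackground forKey:kCIInputBackgroundImageKey];
[composeFilter setValue:pixelateResult forKey:kCIInputImageKey];
result = [composeFilter valueForKey:kCIOutputImageKey];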


1 Answer


I still get suboptimal performance (an 80% runtime increase), but this is what I use now:

// Pixellate the source image with Core Image
CIFilter *pixelateFilter = [CIFilter filterWithName:@"CIPixellate"];
[pixelateFilter setDefaults];
[pixelateFilter setValue:[CIImage imageWithCGImage:editImage.CGImage] forKey:kCIInputImageKey];
[pixelateFilter setValue:@(amount) forKey:@"inputScale"];
[pixelateFilter setValue:vector forKey:@"inputCenter"];
CIImage* result = [pixelateFilter valueForKey:kCIOutputImageKey];

// Render the filter output to a CGImage
CIContext *context = [CIContext contextWithOptions:nil];
CGRect extent = [result extent];
CGImageRef cgImage = [context createCGImage:result fromRect:extent];

// Opaque context: transparent areas get flattened onto the fill color
UIGraphicsBeginImageContextWithOptions(editImage.size, YES, [editImage scale]);
CGContextRef ref = UIGraphicsGetCurrentContext();

// Flip the coordinate system so the CGImage isn't drawn upside down
CGContextTranslateCTM(ref, 0, editImage.size.height);
CGContextScaleCTM(ref, 1.0, -1.0);

// Fill with the background color, then draw the rendered image on top
CGContextSetFillColorWithColor(ref, backgroundFillColor.CGColor);
CGRect drawRect = (CGRect){{0,0},editImage.size};
CGContextFillRect(ref, drawRect);
CGContextDrawImage(ref, drawRect, cgImage);

UIImage* filledImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
returnImage = filledImage;

CGImageRelease(cgImage);
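
If this runs more than once, another thing on my list is reusing a single CIContext instead of creating one per call, since CIContext creation is relatively expensive. This is just a sketch of that idea (sharedCIContext is a made-up name), not something I've measured yet:

// Sketch: build the CIContext once and reuse it across renders
static CIContext *sharedCIContext = nil;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
    sharedCIContext = [CIContext contextWithOptions:nil];
});

CGImageRef cgImage = [sharedCIContext createCGImage:result fromRect:[result extent]];
// ...fill and draw into the opaque UIGraphicsImageContext exactly as above...
CGImageRelease(cgImage);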