
In my app, the user can take multiple images using the UIImagePickerController, and those images are then displayed one by one in the view.

I've been having some trouble with memory management. With the megapixel counts of phone cameras rising quickly, the UIImages returned from UIImagePickerController are memory hogs. On my iPhone 4S, the UIImages are around 5MB; I can hardly imagine what they're like on newer and future models.

A friend of mine said that the best way to handle UIImages was to immediately save them to a JPEG file in my app's document directory and to release the original UIImage as soon as possible. So this is what I've been trying to do. Unfortunately, even after saving the UIImage to a JPEG and leaving no references to it in my code, it is not being deallocated.

Here are the relevant sections of my code. I am using ARC.

// Entry point: UIImagePickerController delegate method
-(void) imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {

    // Process the image.  The method returns a pathname.
    NSString* path = [self processImage:[info objectForKey:UIImagePickerControllerOriginalImage]];

    // Add the image to the view
    [self addImage:path];
}

-(NSString*) processImage:(UIImage*)image {

    // Get a file path
    NSArray* paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString* documentsDirectory = [paths objectAtIndex:0];
    NSString* filename = [self makeImageFilename]; // implementation omitted
    NSString* imagePath = [documentsDirectory stringByAppendingPathComponent:filename];

    // Get the image data (blocking; around 1 second)
    NSData* imageData = UIImageJPEGRepresentation(image, 0.1);

    // Write the data to a file
    [imageData writeToFile:imagePath atomically:YES];

    // Upload the image (non-blocking)
    [self uploadImage:imageData withFilename:filename];

    return imagePath;
}

-(void) uploadImage:(NSData*)imageData withFilename:(NSString*)filename {
    // this sends the upload job (implementation omitted) to a thread
    // pool, which in this case is managed by PhoneGap
    [self.commandDelegate runInBackground:^{
        [self doUploadImage:imageData withFilename:filename];
    }];
}

-(void) addImage:(NSString*)path {
    // implementation omitted: make a UIImageView (set bounds, etc).  Save it
    // in the variable iv.

    iv.image = [UIImage imageWithContentsOfFile:path];
    [iv setNeedsDisplay];
    NSLog(@"Displaying image named %@", path);
    self.imageCount++;
}

Notice how the processImage method takes a reference to a UIImage, but it uses it for only one thing: making the NSData* representation of that image. So, after the processImage method is complete, the UIImage should be released from memory, right?
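One possibility I've wondered about (this is pure speculation on my part) is that the image is being kept alive in an autorelease pool rather than deallocated right away. A sketch of what I mean, draining an explicit pool as soon as the processing is done:

-(void) imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString* path;

    // Speculative: drain any autoreleased temporaries (the UIImage and the
    // intermediate NSData) as soon as processing is finished
    @autoreleasepool {
        path = [self processImage:[info objectForKey:UIImagePickerControllerOriginalImage]];
    }

    [self addImage:path];
}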

What can I do to reduce the memory usage of my app?

Update

I now realize that a screenshot of the allocations profiler would be helpful for explaining this question.

[Screenshot: Allocations instrument trace of the app]

How large of an image do you need for the UIImageView? Based on the code above, the image is saved as a JPEG (saving disk space) and is then reloaded in the addImage method. When it is loaded again, the image takes as much memory as it did before saving it as a JPEG. If possible, rescale to the appropriate size before saving it to disk. – bobnoble
It's true that I'm re-loading the image for the UIImageView, but the JPEG representation is only about 10-15% the size of the original, so I should still see a reduction in memory, right? According to Instruments, the memory usage never dips after it rises. – sffc
@vote539 - although the compressed JPEG is only 10-15% in size, when you reload it for display in an imageView it will need to be uncompressed and the memory requirement will be the same as for any other image. As a rule of thumb, you can estimate it as pixels x colours x bit-depth per colour. – foundry

1 Answer


Your processImage method is not your problem.

We can test your image-saving code by transplanting it into Apple's PhotoPicker demo app.

Conveniently, Apple's sample project is very similar to yours, with a method to take repeated pictures on a timer. In the sample, the images are not saved to the filesystem but accumulate in memory. It comes with this warning:

/* Start the timer to take a photo every 1.5 seconds.
CAUTION: for the purpose of this sample, we will continue to take pictures indefinitely.
Be aware we will run out of memory quickly. You must decide the proper threshold number of photos allowed to take from the camera.
One solution to avoid memory constraints is to save each taken photo to disk rather than keeping all of them in memory.
In low memory situations sometimes our "didReceiveMemoryWarning" method will be called in which case we can recover some memory and keep the app running.
*/

With your method added to Apple's code, we can address this issue.

The imagePicker delegate method looks like this:

- (void)imagePickerController:(UIImagePickerController *)picker 
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info valueForKey:UIImagePickerControllerOriginalImage];

    [self.capturedImages removeAllObjects];   // (1)
    [self.imagePaths addObject:[self processImage:image]]; //(2)

    [self.capturedImages addObject:image];

    if ([self.cameraTimer isValid])
    {
        return;
    }
    [self finishAndUpdate]; //(3)
}

(1) - our addition, to flush the live memory on each image-capture event
(2) - our addition, to save the image to the filesystem and build a list of filesystem paths
(3) - for our tests we are using the cameraTimer to take repeated images, so finishAndUpdate does not get called

I have used your processImage: method as is, with the line:
[self uploadImage:imageData withFilename:filename];
commented out.

I have also added a small makeImageFilename method:

static int imageName = 0;

-(NSString*)makeImageFilename {
    imageName++;
    return [NSString stringWithFormat:@"%d.jpg",imageName];
}

These are the only additions I have made to Apple's code.

Here is the memory footprint of Apple's original code (cameraTimer run without (1) and (2))

[Screenshot: allocations for Apple's original code]

Memory climbed to ~140MB after capture of ~40 images

Here is the memory footprint with the additions (cameraTimer run with (1) and (2))

[Screenshot: allocations with additions (1) and (2)]

The file-saving method fixes the memory issue: memory stays flat, with spikes of ~30MB per image capture.

These tests were run on an iPhone 5S. Uncompressed images are 3264 x 2448 px, which at 24-bit RGB works out to about 24MB (3264 x 2448 x 3 bytes). JPEG-compressed (filesystem) size ranges from ~250kB (0.1 quality, as per your code) to 1-2MB (0.7 quality) up to ~6MB (1.0 quality).

In a comment to your question, you suggest that a re-loaded image will benefit from that compression. This is not the case: when an image is loaded into memory it must first be uncompressed. Its memory footprint will be approximately pixels x colours x bit-depth per colour, regardless of the way the image is stored on disk. As bobnoble has pointed out, this at least suggests that you should avoid loading an image at a greater resolution than you need for display. Say you have a full-screen (retina) imageView of 832 x 640: you are wasting memory loading an image larger than that if your user cannot zoom in. That's a live memory footprint of ~1.6MB, a huge improvement on your 24MB original.
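If display size is all you need, you could downscale before writing to disk. A sketch (the helper name and target size are illustrative, not from your code), drawing the picked image into a smaller bitmap context:

// Illustrative helper: downscale before saving; targetSize is an example
-(UIImage*) scaledImage:(UIImage*)image toSize:(CGSize)targetSize {
    UIGraphicsBeginImageContextWithOptions(targetSize, YES /* opaque */, 1.0);
    [image drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
    UIImage* scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaled;
}

Loading the saved file then costs only the memory of the smaller image. (But this is a digression from your main issue.)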

As processImage doesn't seem to be the cause of your memory trouble, you should look at other possibilities:

1/ You don't have a memory issue. How are you profiling the app?
2/ One of addImage or uploadImage is retaining memory. Try commenting each out in turn to identify which (see the sketch below for one way uploadImage could hold on to data).
3/ The problem is elsewhere (something managed by PhoneGap?)
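On 2/: note that your uploadImage block captures imageData, so every queued upload job keeps its NSData alive until the job actually runs; if uploads back up behind slow network calls, those buffers accumulate. One sketch of an alternative (the uploadImageAtPath: name is hypothetical) is to capture the path instead and read the data inside the block:

-(void) uploadImageAtPath:(NSString*)imagePath withFilename:(NSString*)filename {
    [self.commandDelegate runInBackground:^{
        // read the data only when the job actually runs, so nothing
        // large is retained while jobs wait in the queue
        NSData* imageData = [NSData dataWithContentsOfFile:imagePath];
        [self doUploadImage:imageData withFilename:filename];
    }];
}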

As regards those memory spikes, they are caused by the image-to-data JPEG compression line:
NSData* imageData = UIImageJPEGRepresentation(image, 0.1);

Under the hood, that is ImageIO, and it is probably unavoidable when using UIImagePickerController. See here: Most memory efficient way to save a photo to disk on iPhone? If you switch to AVFoundation you can get at the image as unconverted NSData, so you could avoid the spike.
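A sketch of that approach (assuming you have configured an AVCaptureSession with an AVCaptureStillImageOutput property called stillImageOutput, and that imagePath is whatever destination you choose):

AVCaptureConnection* connection =
    [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                            completionHandler:^(CMSampleBufferRef sampleBuffer, NSError* error) {
    if (sampleBuffer) {
        // JPEG NSData straight from the capture session - no UIImage
        // round trip, so no UIImageJPEGRepresentation spike
        NSData* jpegData =
            [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
        [jpegData writeToFile:imagePath atomically:YES];
    }
}];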