
I'm developing a movie maker application which applies effects to imported videos. I'm using AVAssetWriter in my application. Everything works very well, but I have a big memory problem: my app consumes over 500 MB of RAM during the buffering process. The algorithm for producing a filtered video goes like this:

1- Import the video.

2- Extract all the frames of the video as CMSampleBuffer objects.

3- Convert each CMSampleBuffer to a UIImage.

4- Apply the filter to the UIImage.

5- Convert the UIImage back to a new CMSampleBuffer.

6- Append the new buffer to a writer input.

7- Finally, save the new movie to the photo gallery.
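Step 3 is done by an `imageFromSampleBuffer:` method that is not shown below. A typical implementation (a sketch, assuming the reader output is configured for `kCVPixelFormatType_32BGRA`; the method name matches the code further down, the rest is assumed) looks like this, and a common source of exactly this kind of per-frame memory growth is omitting the `CGImageRelease` in it:

```objectivec
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little |
                                                 kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];

    // CGBitmapContextCreateImage follows the Create rule, so the caller owns
    // cgImage. UIImage retains its own reference; without this release every
    // frame leaks one full-size bitmap.
    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
    return image;
}
```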

The problem is in step 5: I have a function which converts a UIImage to a CVPixelBuffer object and returns it, and then I convert the CVPixelBuffer to a CMSampleBuffer. The function increases memory usage a lot and the application crashes in the end.
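One way to keep the working set flat in that step is to stop calling `CVPixelBufferCreate` per frame and instead draw into buffers recycled by the adaptor's pool. A sketch (assuming `adaptor` is the `AVAssetWriterInputPixelBufferAdaptor` used below; note the pool is `NULL` until the asset writer has actually started writing):

```objectivec
CVPixelBufferRef pxbuffer = NULL;
// Buffers come back to the pool when released, so memory stays bounded.
CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                     adaptor.pixelBufferPool,
                                                     &pxbuffer);
if (status != kCVReturnSuccess || pxbuffer == NULL) {
    return NULL;
}
// ...lock the base address, draw with CGBitmapContextCreate as before, unlock...
// The caller still owns pxbuffer and must release it after
// appendPixelBuffer:withPresentationTime:.
```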

This is my code:

    - (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image andSize:(CGSize)size
    {
        double height = CGImageGetHeight(image);
        double width = CGImageGetWidth(image);

        NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                                 nil];
        CVPixelBufferRef pxbuffer = NULL;

        CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height,
                                              kCVPixelFormatType_32ARGB,
                                              (__bridge CFDictionaryRef)options, &pxbuffer);
        if (status != kCVReturnSuccess) {
            return NULL;
        }

        CVPixelBufferLockBaseAddress(pxbuffer, 0);
        void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

        CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
        // Use the buffer's actual bytes-per-row: CoreVideo may pad rows
        // beyond 4 * width, and a mismatched stride corrupts the output.
        CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8,
                                                     CVPixelBufferGetBytesPerRow(pxbuffer),
                                                     rgbColorSpace, kCGImageAlphaNoneSkipFirst);

        // Center the image vertically when it is shorter than the target size.
        CGFloat Y = (height == size.height) ? 0 : (size.height / 2) - (height / 2);

        // Interpolation quality must be set before drawing to have any effect.
        CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
        CGContextDrawImage(context, CGRectMake(0, Y, width, height), image);

        CGColorSpaceRelease(rgbColorSpace);
        CGContextRelease(context);
        CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

        // The caller owns the returned buffer and must release it.
        return pxbuffer;
    }

CGContextDrawImage increases memory by 2–5 MB per frame conversion.

I tried the following solutions:

1- Releasing pxbuffer using CFRelease.

2- Using CGImageRelease to release the image ref.

3- Surrounding the code with an @autoreleasepool block.

4- Using CGContextRelease.

5- Calling UIGraphicsEndImageContext.

6- Running the Analyze tool in Xcode and fixing all the points it reported.

Here is the full code for the video filtering:

    - (void)assetFilteringMethod:(FilterType)filterType AndAssetURL:(NSURL *)assetURL
    {
        CMSampleBufferRef sbuff;
        UIImage *bufferedImage;

        [areader addOutput:rout];
        [areader startReading];

        while ([areader status] != AVAssetReaderStatusCompleted) {

            sbuff = [rout copyNextSampleBuffer];
            if (sbuff == NULL) {
                [areader cancelReading];
            }
            else if (writerInput.readyForMoreMediaData) {

                @autoreleasepool {
                    bufferedImage = [self imageFromSampleBuffer:sbuff];
                    bufferedImage = [FrameFilterClass convertImageToFilterWithFilterType:filterType
                                                                                andImage:bufferedImage];

                    CVPixelBufferRef buffer =
                        [self pixelBufferFromCGImage:[bufferedImage CGImage]
                                             andSize:CGSizeMake(320, 240)];

                    [adaptor appendPixelBuffer:buffer
                          withPresentationTime:CMSampleBufferGetPresentationTimeStamp(sbuff)];

                    CFRelease(buffer);
                    CFRelease(sbuff);
                }
            }
            else {
                // The writer input is not ready: the frame is dropped, but the
                // copied sample buffer must still be released or it leaks.
                CFRelease(sbuff);
            }
        }

        // Finished buffering
        [videoWriter finishWritingWithCompletionHandler:^{
            if (videoWriter.status == AVAssetWriterStatusCompleted) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
                    NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
                    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:movieURL]) {
                        [library writeVideoAtPathToSavedPhotosAlbum:movieURL
                                                    completionBlock:^(NSURL *assetURL, NSError *error) {
                        }];
                    }
                });
            }
            else {
                NSLog(@"Video writing failed: %@", videoWriter.error);
            }
        }];
    }

I spent around 3 to 4 days trying to solve this problem... Any help would be appreciated.


1 Answer


You have to release the image using this line:

    CGImageRelease(image.CGImage);
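A sketch of where such a release could sit in the asker's loop, using the names from the question. Note this is only safe if `imageFromSampleBuffer:` built the CGImage with a Create-rule call (e.g. `CGBitmapContextCreateImage`) and never balanced that retain itself; otherwise it over-releases and crashes:

```objectivec
@autoreleasepool {
    bufferedImage = [self imageFromSampleBuffer:sbuff];
    bufferedImage = [FrameFilterClass convertImageToFilterWithFilterType:filterType
                                                                andImage:bufferedImage];

    CVPixelBufferRef buffer = [self pixelBufferFromCGImage:[bufferedImage CGImage]
                                                   andSize:CGSizeMake(320, 240)];
    [adaptor appendPixelBuffer:buffer
          withPresentationTime:CMSampleBufferGetPresentationTimeStamp(sbuff)];

    // Balance the unreleased Create-rule retain from imageFromSampleBuffer:.
    CGImageRelease(bufferedImage.CGImage);

    CFRelease(buffer);
    CFRelease(sbuff);
}
```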