12 votes

First time asking a question here. I'm hoping the post is clear and the sample code is formatted correctly.

I'm experimenting with AVFoundation and time lapse photography.

My intent is to grab every Nth frame from the video camera of an iOS device (my iPod touch, version 4) and write each of those frames out to a file to create a timelapse. I'm using AVCaptureVideoDataOutput, AVAssetWriter and AVAssetWriterInput.

The problem is that if I append the CMSampleBufferRef passed to captureOutput:didOutputSampleBuffer:fromConnection: as-is, each frame plays back for the length of time between the original input frames. For example, if I keep one frame per second of capture, the frames carry timestamps of 0 s, 1 s, 2 s, and so on, so the result plays at roughly 1fps. I'm looking to get 30fps playback.

I've tried using CMSampleBufferCreateCopyWithNewTiming() to retime each frame, but after 13 frames are written to the file, captureOutput:didOutputSampleBuffer:fromConnection: stops being called. The interface stays active and I can tap a button to stop the capture and save it to the photo library for playback. It appears to play back as I want, at 30fps, but it only has those 13 frames.

How can I accomplish my goal of 30fps playback? How can I tell where the app is getting lost and why?

I've added a flag called useNativeTime so I can test both cases. When it is set to YES, I get every frame I'm interested in, since the callback never 'gets lost'. When I set that flag to NO, only 13 frames are ever processed and the method is never called again. As mentioned above, in both cases I can play back the video.

Thanks for any help.

Here is where I'm trying to do the retiming.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    BOOL useNativeTime = NO;
    BOOL appendSuccessFlag = NO;

    //NSLog(@"in captureOutpput sample buffer method");
    if( !CMSampleBufferDataIsReady(sampleBuffer) )
    {
        NSLog( @"sample buffer is not ready. Skipping sample" );
        //CMSampleBufferInvalidate(sampleBuffer);
        return;
    }

    if (! [inputWriterBuffer isReadyForMoreMediaData])
    {
        NSLog(@"Not ready for data.");
    }
    else {
        // Write every first frame of n frames (30 native from camera). 
        intervalFrames++;
        if (intervalFrames > 30) {
            intervalFrames = 1;
        }
        else if (intervalFrames != 1) {
            //CMSampleBufferInvalidate(sampleBuffer);
            return;
        }

        // Need to initialize start session time.
        if (writtenFrames < 1) {
            if (useNativeTime) imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            else imageSourceTime = CMTimeMake( 0 * 20 ,600); //CMTimeMake(1,30);
            [outputWriter startSessionAtSourceTime: imageSourceTime];
            NSLog(@"Starting CMtime");
            CMTimeShow(imageSourceTime);
        }

        if (useNativeTime) {
            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTimeShow(imageSourceTime);
            // CMTime myTiming = CMTimeMake(writtenFrames * 20,600);
            // CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, myTiming); // Tried, but has no effect.
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:sampleBuffer];
        }
        else {
            CMSampleBufferRef newSampleBuffer;
            CMSampleTimingInfo sampleTimingInfo;
            sampleTimingInfo.duration = CMTimeMake(20,600);
            sampleTimingInfo.presentationTimeStamp = CMTimeMake( (writtenFrames + 0) * 20,600);
            sampleTimingInfo.decodeTimeStamp = kCMTimeInvalid;
            OSStatus myStatus;

            //NSLog(@"numSamples of sampleBuffer: %i", CMSampleBufferGetNumSamples(sampleBuffer) );
            myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                             sampleBuffer,
                                                             1,
                                                             &sampleTimingInfo, // maybe a little confused on this param.
                                                             &newSampleBuffer);
            // These confirm the good health of our newSampleBuffer.
            if (myStatus != 0) NSLog(@"CMSampleBufferCreateCopyWithNewTiming() myStatus: %i",myStatus);
            if (! CMSampleBufferIsValid(newSampleBuffer)) NSLog(@"CMSampleBufferIsValid NOT!");

            // No effect.
            //myStatus = CMSampleBufferMakeDataReady(newSampleBuffer);  // How is this different; CMSampleBufferSetDataReady ?
            //if (myStatus != 0) NSLog(@"CMSampleBufferMakeDataReady() myStatus: %i",myStatus);

            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(newSampleBuffer);
            CMTimeShow(imageSourceTime);
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer];
            //CMSampleBufferInvalidate(sampleBuffer); // Docs don't describe what this does. It doesn't seem to affect my problem. Maybe it's meant to be used with CMSampleBufferSetInvalidateCallback?
            //CFRelease(sampleBuffer); // - Not surprisingly - “EXC_BAD_ACCESS”
        }

        if (!appendSuccessFlag)
        {
            NSLog(@"Failed to append pixel buffer");
        }
        else {
            writtenFrames++;
            NSLog(@"writtenFrames: %i", writtenFrames);
        }
    }

    //[self displayOuptutWritterStatus];    // Expect and see AVAssetWriterStatusWriting.
}

My setup routine.

- (IBAction) recordingStartStop: (id) sender
{
    NSError *error = nil;

    if (self.isRecording) {
        NSLog(@"~~~~~~~~~ STOPPING RECORDING ~~~~~~~~~");
        self.isRecording = NO;
        [recordingStarStop setTitle: @"Record" forState: UIControlStateNormal];

        //[self.captureSession stopRunning];
        [inputWriterBuffer markAsFinished];
        [outputWriter endSessionAtSourceTime:imageSourceTime];
        [outputWriter finishWriting]; // Blocks until file is completely written, or an error occurs.
        NSLog(@"finished CMtime");
        CMTimeShow(imageSourceTime);

        // Really, I should loop through the outputs and close all of them or target specific ones.
        // Since I'm only recording video right now, I feel safe doing this.
        [self.captureSession removeOutput: [[self.captureSession outputs] objectAtIndex: 0]];

        [videoOutput release];
        [inputWriterBuffer release];
        [outputWriter release];
        videoOutput = nil;
        inputWriterBuffer = nil;
        outputWriter = nil;
        NSLog(@"~~~~~~~~~ STOPPED RECORDING ~~~~~~~~~");
        NSLog(@"Calling UIVideoAtPathIsCompatibleWithSavedPhotosAlbum.");
        NSLog(@"filePath: %@", [projectPaths movieFilePath]);
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum([projectPaths movieFilePath])) {
            NSLog(@"Calling UISaveVideoAtPathToSavedPhotosAlbum.");
            UISaveVideoAtPathToSavedPhotosAlbum ([projectPaths movieFilePath], self, @selector(video:didFinishSavingWithError: contextInfo:), nil);
        }
        NSLog(@"~~~~~~~~~ WROTE RECORDING to PhotosAlbum ~~~~~~~~~");
    }
    else {
        NSLog(@"~~~~~~~~~ STARTING RECORDING ~~~~~~~~~");
        projectPaths = [[ProjectPaths alloc] initWithProjectFolder: @"TestProject"];
        intervalFrames = 30;

        videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        NSMutableDictionary * cameraVideoSettings = [[[NSMutableDictionary alloc] init] autorelease];
        NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
        NSNumber* value = [NSNumber numberWithUnsignedInt: kCVPixelFormatType_32BGRA]; //kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];
        [cameraVideoSettings setValue: value forKey: key];
        [videoOutput setVideoSettings: cameraVideoSettings];
        [videoOutput setMinFrameDuration: CMTimeMake(20, 600)]; //CMTimeMake(1, 30)]; // 30fps
        [videoOutput setAlwaysDiscardsLateVideoFrames: YES];

        queue = dispatch_queue_create("cameraQueue", NULL);
        [videoOutput setSampleBufferDelegate: self queue: queue];
        dispatch_release(queue);

        NSMutableDictionary *outputSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [outputSettings setValue: AVVideoCodecH264 forKey: AVVideoCodecKey];
        [outputSettings setValue: [NSNumber numberWithInt: 1280] forKey: AVVideoWidthKey]; // currently assuming
        [outputSettings setValue: [NSNumber numberWithInt: 720] forKey: AVVideoHeightKey];

        NSMutableDictionary *compressionSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [compressionSettings setValue: AVVideoProfileLevelH264Main30 forKey: AVVideoProfileLevelKey];
        //[compressionSettings setValue: [NSNumber numberWithDouble:1024.0*1024.0] forKey: AVVideoAverageBitRateKey];
        [outputSettings setValue: compressionSettings forKey: AVVideoCompressionPropertiesKey];

        inputWriterBuffer = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo outputSettings: outputSettings];
        [inputWriterBuffer retain];
        inputWriterBuffer.expectsMediaDataInRealTime = YES;

        outputWriter = [AVAssetWriter assetWriterWithURL: [projectPaths movieURLPath] fileType: AVFileTypeQuickTimeMovie error: &error];
        [outputWriter retain];

        if (error) NSLog(@"error for outputWriter = [AVAssetWriter assetWriterWithURL:fileType:error:");
        if ([outputWriter canAddInput: inputWriterBuffer]) [outputWriter addInput: inputWriterBuffer];
        else NSLog(@"can not add input");

        if (![outputWriter canApplyOutputSettings: outputSettings forMediaType:AVMediaTypeVideo]) NSLog(@"outputSettings are NOT supported");

        if ([captureSession canAddOutput: videoOutput]) [self.captureSession addOutput: videoOutput];
        else NSLog(@"could not addOutput: videoOutput to captureSession");

        //[self.captureSession startRunning];
        self.isRecording = YES;
        [recordingStarStop setTitle: @"Stop" forState: UIControlStateNormal];

        writtenFrames = 0;
        imageSourceTime = kCMTimeZero;
        [outputWriter startWriting];
        //[outputWriter startSessionAtSourceTime: imageSourceTime];
        NSLog(@"~~~~~~~~~ STARTED RECORDING ~~~~~~~~~");
        NSLog (@"recording to fileURL: %@", [projectPaths movieURLPath]);
    }

    NSLog(@"isRecording: %@", self.isRecording ? @"YES" : @"NO");

    [self displayOuptutWritterStatus];  
}

2 Answers

12 votes

OK, I found the bug in my first post.

When using

myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                 sampleBuffer,
                                                 1,
                                                 &sampleTimingInfo, 
                                                 &newSampleBuffer);

you need to balance it with a CFRelease(newSampleBuffer) once you are done with the copy. Without that release, each retimed copy keeps its underlying buffer alive, the capture pipeline eventually runs out of sample buffers, and the delegate callback stops being called. That is why I only ever saw 13 frames.
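
Applied to the retiming branch from my question, a sketch of the fixed code looks like this (same instance variables as above; the release goes right after the append):

CMSampleBufferRef newSampleBuffer = NULL;
CMSampleTimingInfo sampleTimingInfo;
sampleTimingInfo.duration = CMTimeMake(20, 600);
sampleTimingInfo.presentationTimeStamp = CMTimeMake(writtenFrames * 20, 600);
sampleTimingInfo.decodeTimeStamp = kCMTimeInvalid;

OSStatus myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                          sampleBuffer,
                                                          1,
                                                          &sampleTimingInfo,
                                                          &newSampleBuffer);
if (myStatus == noErr && newSampleBuffer != NULL) {
    appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer];
    CFRelease(newSampleBuffer); // Balances the Create call so the capture pipeline can reclaim its buffer.
} else {
    NSLog(@"CMSampleBufferCreateCopyWithNewTiming() myStatus: %i", (int)myStatus);
}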

The same idea holds true when using a CVPixelBufferRef from the pixelBufferPool of an AVAssetWriterInputPixelBufferAdaptor instance. You would call CVPixelBufferRelease(yourCVPixelBufferRef); after calling the appendPixelBuffer:withPresentationTime: method.

Hope this is helpful to someone else.

3 votes

With a little more searching and reading I have a working solution. I don't know whether it is the best method, but so far, so good.

In my setup routine I've added an AVAssetWriterInputPixelBufferAdaptor. The code addition looks like this.

inputWriterBufferAdaptor = [AVAssetWriterInputPixelBufferAdaptor
            assetWriterInputPixelBufferAdaptorWithAssetWriterInput: inputWriterBuffer
            sourcePixelBufferAttributes: nil];
[inputWriterBufferAdaptor retain];

For completeness in understanding the code below, I also have these three lines in the setup method.

fpsOutput = 30; //Some possible values: 10, 15, 24, 25, 30, or 30/1.001 (29.97).
cmTimeSecondsDenominatorTimescale = 600 * 100000; //To more precisely handle 29.97.
cmTimeNumeratorValue = cmTimeSecondsDenominatorTimescale / fpsOutput;
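
For example, with fpsOutput = 30 these work out to cmTimeNumeratorValue = 60,000,000 / 30 = 2,000,000, so frame N is stamped at N * 2,000,000 / 60,000,000 = N/30 seconds. The oversized timescale is what lets a 29.97-style rate (30/1.001 fps) keep an integer numerator as well: 60,000,000 * 1.001 / 30 = 2,002,000 ticks per frame, which a plain timescale of 600 cannot represent exactly.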

Instead of applying a retiming to a copy of the sample buffer, I now have the following three lines of code that effectively do the same thing. Notice the withPresentationTime: parameter of the adaptor's append method. By passing my custom value there, I get the correct timing I'm seeking.

CVPixelBufferRef myImage = CMSampleBufferGetImageBuffer( sampleBuffer );
imageSourceTime = CMTimeMake( writtenFrames * cmTimeNumeratorValue, cmTimeSecondsDenominatorTimescale);
appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer: myImage withPresentationTime: imageSourceTime];

Use of the AVAssetWriterInputPixelBufferAdaptor's pixelBufferPool property may offer some gains, but I haven't figured that out yet.
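
If someone wants to experiment with the pool, an untested sketch of the idea might look like the following: pull a buffer from the adaptor's pool (which only exists once the writer session has started), copy the camera frame into it, append it, then release it. It uses the same instance variables as above, and the row-copy loop assumes the single-plane kCVPixelFormatType_32BGRA format from my capture settings.

CVPixelBufferRef cameraBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferRef pooledBuffer = NULL;
CVReturn poolStatus = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                         inputWriterBufferAdaptor.pixelBufferPool,
                                                         &pooledBuffer);
if (poolStatus == kCVReturnSuccess && pooledBuffer != NULL) {
    // Copy the camera frame into the pooled buffer row by row,
    // in case the two buffers use different bytes-per-row padding.
    CVPixelBufferLockBaseAddress(cameraBuffer, 0);
    CVPixelBufferLockBaseAddress(pooledBuffer, 0);
    size_t height = CVPixelBufferGetHeight(cameraBuffer);
    size_t srcBytesPerRow = CVPixelBufferGetBytesPerRow(cameraBuffer);
    size_t dstBytesPerRow = CVPixelBufferGetBytesPerRow(pooledBuffer);
    uint8_t *src = (uint8_t *)CVPixelBufferGetBaseAddress(cameraBuffer);
    uint8_t *dst = (uint8_t *)CVPixelBufferGetBaseAddress(pooledBuffer);
    for (size_t row = 0; row < height; row++) {
        memcpy(dst + row * dstBytesPerRow, src + row * srcBytesPerRow, MIN(srcBytesPerRow, dstBytesPerRow));
    }
    CVPixelBufferUnlockBaseAddress(pooledBuffer, 0);
    CVPixelBufferUnlockBaseAddress(cameraBuffer, 0);

    imageSourceTime = CMTimeMake(writtenFrames * cmTimeNumeratorValue, cmTimeSecondsDenominatorTimescale);
    appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer:pooledBuffer withPresentationTime:imageSourceTime];
    CVPixelBufferRelease(pooledBuffer); // We created this one, so we release it.
}

Since CVPixelBufferPoolCreatePixelBuffer hands back a buffer you own, the CVPixelBufferRelease rule from the other answer applies here as well. For a straight timelapse, where the camera's buffer is appended unmodified, the pool mainly becomes interesting if you want to alter the frame before writing it.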