3 votes

I'm capturing the camera feed and writing it to a movie. The problem is that after the export, the movie has a couple of seconds of black at the start (relative to when the recording actually began).

I think this is related to [self.assetWriter startSessionAtSourceTime:kCMTimeZero];. I had a half-working solution using a frameStart variable that just counted upward in the sample buffer delegate method:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    frameStart++;
    if (self.startRecording == YES) {

        static int64_t frameNumber = 0;
        if(self.assetWriterInput.readyForMoreMediaData) {
            [self.pixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:CMTimeMake(frameNumber, 25)];
        }
        frameNumber++;
    }
}

and then calling this when the user presses a button:

[self.assetWriter startSessionAtSourceTime:CMTimeMake(frameStart,25)];

This works, but only once. If I want to record a second movie, the black frames are back again.

Also, when I look at the output movie, the frame rate is 25 fps like I want it to be, but the video looks sped up, as if there is not enough time between the frames; the movie plays about twice as fast.

NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                [NSNumber numberWithInt:640], AVVideoWidthKey,
                                [NSNumber numberWithInt:480], AVVideoHeightKey,
                                AVVideoCodecH264, AVVideoCodecKey,
                                nil];

self.assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
self.assetWriterInput.expectsMediaDataInRealTime = YES;

2 Answers

3 votes

You don't need to count frame timestamps on your own. You can get the timestamp of the current sample with

CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

However, it seems to me you are just passing the frame's pixel buffer to the adaptor without modifying it. Wouldn't it be easier to pass the sample buffer itself directly to the assetWriterInput, like this?

[self.assetWriterInput appendSampleBuffer:sampleBuffer];
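Putting both suggestions together, the delegate method might look something like the sketch below. It assumes a sessionStarted BOOL property (not in the original code) that you reset to NO each time a new recording begins; starting the writer session at the first frame's own timestamp is what avoids the black lead-in.

```objectivec
// Sketch only. Assumes self.sessionStarted is a BOOL property that is
// reset to NO whenever a new recording starts.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {

    if (!self.startRecording) {
        return;
    }

    // Use the capture timestamp instead of a hand-counted frame number.
    CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

    // Start the writer session at the first recorded frame's timestamp,
    // so the movie begins at the actual recording start.
    if (!self.sessionStarted) {
        [self.assetWriter startSessionAtSourceTime:timestamp];
        self.sessionStarted = YES;
    }

    if (self.assetWriterInput.readyForMoreMediaData) {
        [self.assetWriterInput appendSampleBuffer:sampleBuffer];
    }
}
```

Because each buffer carries its real presentation time, playback speed also comes out right regardless of the actual capture frame rate.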
0 votes

First of all, why are you incrementing a frame counter twice for every frame (frameStart and frameNumber)? Increment only once and remove the other; that should fix the playback speed.

Second, are you resetting frameNumber to 0 when you finish recording? If not, then that is your problem. Otherwise, I'd need more explanation of what is going on here.
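Note that frameNumber in the question is a static local variable, so it survives across recordings and is never reset. A sketch of what the reset could look like, assuming you move the counters into instance variables and have some stop method (stopRecording is a hypothetical name, not from the question):

```objectivec
// Sketch: counters as instance variables instead of a static local,
// reset when a recording ends so the next one starts at time zero.
- (void)stopRecording {
    self.startRecording = NO;
    _frameNumber = 0;   // hypothetical ivar replacing the static local
    _frameStart  = 0;
}
```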

Regards