Let me see if I understand this correctly.

On the most advanced current hardware, iOS allows me to record at the following frame rates: 30, 60, 120 and 240 fps.

But these frame rates behave differently. If I shoot at 30 or 60 fps, I expect the video files created at those rates to play back at 30 and 60 fps respectively.

But if I shoot at 120 or 240 fps, I expect the video files created at those rates to play back at 30 fps, otherwise I would not see the slow motion.
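For context, this is roughly how I understand high-frame-rate capture is enabled (a sketch, not my exact code; the 240 fps target and the use of the default video device are assumptions):

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: pick a device format whose supported frame rate ranges reach the
// target rate, then pin the frame duration to 1/240 s.
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceFormat *bestFormat = nil;
AVFrameRateRange *bestRange = nil;

for (AVCaptureDeviceFormat *format in device.formats) {
    for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
        if (range.maxFrameRate >= 240 &&
            (bestRange == nil || range.maxFrameRate > bestRange.maxFrameRate)) {
            bestFormat = format;
            bestRange  = range;
        }
    }
}

NSError *error = nil;
if (bestFormat && [device lockForConfiguration:&error]) {
    device.activeFormat = bestFormat;
    device.activeVideoMinFrameDuration = CMTimeMake(1, 240); // deliver a frame every 1/240 s
    device.activeVideoMaxFrameDuration = CMTimeMake(1, 240);
    [device unlockForConfiguration];
}
```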
A few questions:
- am I right?
- is there a way to shoot at 120 or 240 fps and play back at 120 and 240 fps respectively? I mean, play at the frame rate the videos were shot at, without slow motion?
- how do I control that frame rate when I write the file?
I am creating the `AVAssetWriter` input like this:
```objc
NSDictionary *videoCompressionSettings = @{
    AVVideoCodecKey  : AVVideoCodecH264,
    AVVideoWidthKey  : @(videoWidth),
    AVVideoHeightKey : @(videoHeight),
    AVVideoCompressionPropertiesKey : @{
        AVVideoAverageBitRateKey      : @(bitsPerSecond),
        AVVideoMaxKeyFrameIntervalKey : @(1)
    }
};

_assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                            outputSettings:videoCompressionSettings];
```
and there is no apparent way to control that.

NOTE: I have tried different numbers where that `1` is. I have tried `1.0/fps`, I have tried `fps`, and I have removed the key entirely. No difference.
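For reference, the only frame-rate-related keys I know of inside `AVVideoCompressionPropertiesKey` are encoder hints rather than a playback rate, and `AVVideoMaxKeyFrameIntervalKey` counts frames between key frames, so `@(1)` simply forces every frame to be a key frame. A hedged sketch of what I mean (the 240 values are assumptions; `videoWidth`, `videoHeight` and `bitsPerSecond` come from the setup below):

```objc
NSDictionary *videoCompressionSettings = @{
    AVVideoCodecKey  : AVVideoCodecH264,
    AVVideoWidthKey  : @(videoWidth),
    AVVideoHeightKey : @(videoHeight),
    AVVideoCompressionPropertiesKey : @{
        AVVideoAverageBitRateKey          : @(bitsPerSecond),
        AVVideoExpectedSourceFrameRateKey : @(240), // hint to the encoder: source delivers 240 fps
        AVVideoMaxKeyFrameIntervalKey     : @(240)  // at most one key frame per second at 240 fps
    }
};
```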
This is how I set up the `AVAssetWriter`:
```objc
AVAssetWriter *newAssetWriter = [[AVAssetWriter alloc] initWithURL:_movieURL
                                                          fileType:AVFileTypeQuickTimeMovie
                                                             error:&error];
_assetWriter = newAssetWriter;
_assetWriter.shouldOptimizeForNetworkUse = NO;

CGFloat videoWidth  = size.width;
CGFloat videoHeight = size.height;

NSUInteger numPixels = videoWidth * videoHeight;
NSUInteger bitsPerSecond;

// Assume that lower-than-SD resolutions are intended for streaming, and use a lower bitrate
// if ( numPixels < (640 * 480) )
//     bitsPerPixel = 4.05; // This bitrate matches the quality produced by AVCaptureSessionPresetMedium or Low.
// else
CGFloat bitsPerPixel = 11.4; // This bitrate matches the quality produced by AVCaptureSessionPresetHigh.
bitsPerSecond = numPixels * bitsPerPixel;

NSDictionary *videoCompressionSettings = @{
    AVVideoCodecKey  : AVVideoCodecH264,
    AVVideoWidthKey  : @(videoWidth),
    AVVideoHeightKey : @(videoHeight),
    AVVideoCompressionPropertiesKey : @{ AVVideoAverageBitRateKey : @(bitsPerSecond) }
};

if (![_assetWriter canApplyOutputSettings:videoCompressionSettings forMediaType:AVMediaTypeVideo]) {
    NSLog(@"Couldn't apply the video output settings.");
    return;
}

_assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                            outputSettings:videoCompressionSettings
                                                          sourceFormatHint:formatDescription];
_assetWriterVideoInput.expectsMediaDataInRealTime = YES;

NSDictionary *adaptorDict = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA),
    (id)kCVPixelBufferWidthKey           : @(videoWidth),
    (id)kCVPixelBufferHeightKey          : @(videoHeight)
};

_pixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
    initWithAssetWriterInput:_assetWriterVideoInput
    sourcePixelBufferAttributes:adaptorDict];

// Add the asset writer input to the asset writer
if (![_assetWriter canAddInput:_assetWriterVideoInput]) {
    return;
}
[_assetWriter addInput:_assetWriterVideoInput];
```
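One thing I have not tried yet (sketch only; I don't know whether it affects the reported frame rate): the writer and its video input expose explicit time scales that control how the appended timestamps are stored in the movie.

```objc
// Speculative: set the movie/track time scales to the capture rate.
// Both properties must be set before -startWriting is called.
_assetWriter.movieTimeScale           = 240;
_assetWriterVideoInput.mediaTimeScale = 240;
```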
The `captureOutput` method is very simple. I get the image from the filter and write it to the file using:
```objc
if (videoJustStartWriting)
    [_assetWriter startSessionAtSourceTime:presentationTime];

CVPixelBufferRef renderedOutputPixelBuffer = NULL;
OSStatus err = CVPixelBufferPoolCreatePixelBuffer(nil,
                                                  _pixelBufferAdaptor.pixelBufferPool,
                                                  &renderedOutputPixelBuffer);
if (err) return; // NSLog(@"Cannot obtain a pixel buffer from the buffer pool");

// _ciContext is a Metal-backed CIContext
[_ciContext render:finalImage
   toCVPixelBuffer:renderedOutputPixelBuffer
            bounds:[finalImage extent]
        colorSpace:_sDeviceRgbColorSpace];

[self writeVideoPixelBuffer:renderedOutputPixelBuffer
            withInitialTime:presentationTime];
```
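For completeness, `presentationTime` is the timestamp I take from the incoming sample buffer; the delta log below is only a debugging sketch I use to verify the capture rate (the `sampleBuffer` name is assumed):

```objc
// Assumed context: inside captureOutput:didOutputSampleBuffer:fromConnection:
CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

// Debug only: log the interval between consecutive frames.
static CMTime lastPTS; // zero-initialized => invalid, so the very first frame is skipped
if (CMTIME_IS_VALID(lastPTS)) {
    Float64 delta = CMTimeGetSeconds(CMTimeSubtract(presentationTime, lastPTS));
    NSLog(@"frame interval: %.6f s (~%.1f fps)", delta, delta > 0 ? 1.0 / delta : 0.0);
}
lastPTS = presentationTime;
```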
```objc
- (void)writeVideoPixelBuffer:(CVPixelBufferRef)pixelBuffer withInitialTime:(CMTime)presentationTime
{
    if ( _assetWriter.status == AVAssetWriterStatusUnknown ) {
        // An unknown status means writing hasn't started yet, so start writing
        // with the buffer's presentation timestamp as the session start time.
        if ([_assetWriter startWriting]) {
            [_assetWriter startSessionAtSourceTime:presentationTime];
        }
    }

    if ( _assetWriter.status == AVAssetWriterStatusWriting ) {
        // While the writer is writing, append the pixel buffer through the adaptor.
        if (_assetWriterVideoInput.readyForMoreMediaData) {
            if (![_pixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime]) {
                NSLog(@"error appending pixel buffer: %@", _assetWriter.error.localizedFailureReason);
            }
        }
    }

    if ( _assetWriter.status == AVAssetWriterStatusFailed ) {
        NSLog(@"asset writer failed: %@", _assetWriter.error);
    }
}
```
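For completeness, this is roughly how I'd expect the recording to be closed (a sketch; `stopRecording` is a hypothetical name and the real teardown is not shown above):

```objc
- (void)stopRecording
{
    [_assetWriterVideoInput markAsFinished];
    [_assetWriter finishWritingWithCompletionHandler:^{
        if (self->_assetWriter.status == AVAssetWriterStatusCompleted) {
            NSLog(@"finished writing %@", self->_movieURL);
        } else {
            NSLog(@"finish failed: %@", self->_assetWriter.error);
        }
    }];
}
```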
I set the whole thing up to shoot at 240 fps. These are the presentation times of the frames being appended:
```
time ======= 113594.311510508
time ======= 113594.324011508
time ======= 113594.328178716
time ======= 113594.340679424
time ======= 113594.344846383
```
If you do some calculation between them you will see that the frame rate is about 240 fps, so the frames are being stored with the correct timestamps.

But when I watch the video, the movement is not in slow motion, and QuickTime says the video is 30 fps.
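To double-check what actually ends up in the file, instead of trusting the player, a quick sketch that reads the frame rate reported by the video track (run after the writer has finished):

```objc
AVAsset *asset = [AVAsset assetWithURL:_movieURL];
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
NSLog(@"nominal fps: %f, min frame duration: %f s, duration: %f s",
      videoTrack.nominalFrameRate,
      CMTimeGetSeconds(videoTrack.minFrameDuration),
      CMTimeGetSeconds(asset.duration));
```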
Note: this app grabs frames from the camera, the frames go into CIFilters, and the result of those filters is converted back to a sample buffer that is stored to a file and displayed on screen.
Comment: "… `AVAssetWriterInputPixelBufferAdaptor`. Do you get the expected frame rate? If you do, look closer at them; if not, you've simplified the problem." – Rhythmic Fistman