What I wanted: to insert multiple video layers, each with its own opacity, ALL at time 0:00 of an AVMutableCompositionTrack.
I have read the official AVFoundation documentation carefully, along with several WWDC sessions on this topic, but I can't understand why the result does NOT match what the API documentation states.
I can achieve the overlay effect with two AVPlayerLayers during playback, which suggests I could also use AVVideoCompositionCoreAnimationTool to do something similar during export. However, I would rather reserve CALayers for subtitles, image overlays, and animations.
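For reference, the playback overlay that already works looks roughly like this (a simplified sketch; the view setup is omitted and the URLs and opacity value are placeholders):

AVPlayer *playerA = [AVPlayer playerWithURL:urlA];
AVPlayer *playerB = [AVPlayer playerWithURL:urlB];

AVPlayerLayer *layerA = [AVPlayerLayer playerLayerWithPlayer:playerA];
AVPlayerLayer *layerB = [AVPlayerLayer playerLayerWithPlayer:playerB];
layerA.frame = self.view.bounds;
layerB.frame = self.view.bounds;
layerB.opacity = 0.5; // the top layer is semi-transparent so both videos show through

[self.view.layer addSublayer:layerA];
[self.view.layer addSublayer:layerB];

[playerA play];
[playerB play];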
What I tried for each AVAsset being inserted:
- (void)addVideo:(AVAsset *)asset_in withOpacity:(float)opacity
{
    // Demo of compositing videos with opacity, so every clip is inserted at time 0:00.
    AVAssetTrack *assettrack_in = [[asset_in tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [_videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset_in.duration)
                                    ofTrack:assettrack_in
                                     atTime:kCMTimeZero
                                      error:nil];

    AVMutableVideoCompositionInstruction *mutableVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mutableVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, assettrack_in.timeRange.duration);

    AVMutableVideoCompositionLayerInstruction *videoCompositionLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:_videoCompositionTrack];
    [videoCompositionLayerInstruction setTransform:assettrack_in.preferredTransform atTime:kCMTimeZero];
    [videoCompositionLayerInstruction setOpacity:opacity atTime:kCMTimeZero];

    mutableVideoCompositionInstruction.layerInstructions = @[videoCompositionLayerInstruction];
    [_arrayVideoCompositionInstructions addObject:mutableVideoCompositionInstruction];
}
Note that insertTimeRange:ofTrack:atTime:error: is called with atTime:kCMTimeZero, so I expect every clip to be placed at the very beginning of the composition.
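For completeness, the composition objects are created roughly like this, and addVideo: is called once per clip (a sketch; the frame duration, render size, and asset names are placeholders):

_mutableComposition = [AVMutableComposition composition];
_videoCompositionTrack = [_mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                          preferredTrackID:kCMPersistentTrackID_Invalid];

_mutableVideoComposition = [AVMutableVideoComposition videoComposition];
_mutableVideoComposition.frameDuration = CMTimeMake(1, 30);   // placeholder
_mutableVideoComposition.renderSize = CGSizeMake(1280, 720);  // placeholder

_arrayVideoCompositionInstructions = [NSMutableArray array];

// Both clips are inserted at time 0:00, the second one semi-transparent.
[self addVideo:assetA withOpacity:1.0f];
[self addVideo:assetB withOpacity:0.5f];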
What I tried for exporting:
- (IBAction)ExportAndPlay:(id)sender
{
    _mutableVideoComposition.instructions = [_arrayVideoCompositionInstructions copy];

    // Create a static date formatter so we only have to initialize it once.
    static NSDateFormatter *kDateFormatter;
    if (!kDateFormatter) {
        kDateFormatter = [[NSDateFormatter alloc] init];
        kDateFormatter.dateStyle = NSDateFormatterMediumStyle;
        kDateFormatter.timeStyle = NSDateFormatterShortStyle;
    }

    // Create the export session with the composition and set the preset to the highest quality.
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:_mutableComposition
                                                                      presetName:AVAssetExportPresetHighestQuality];

    // Set the desired output URL for the file created by the export process.
    NSURL *documentsURL = [[NSFileManager defaultManager] URLForDirectory:NSDocumentDirectory
                                                                 inDomain:NSUserDomainMask
                                                        appropriateForURL:nil
                                                                   create:YES
                                                                    error:nil];
    NSString *fileExtension = CFBridgingRelease(UTTypeCopyPreferredTagWithClass((CFStringRef)AVFileTypeQuickTimeMovie,
                                                                                kUTTagClassFilenameExtension));
    exporter.outputURL = [[documentsURL URLByAppendingPathComponent:[kDateFormatter stringFromDate:[NSDate date]]]
                          URLByAppendingPathExtension:fileExtension];

    // Set the output file type to be a QuickTime movie.
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = _mutableVideoComposition;

    // Asynchronously export the composition to a video file and play it back once export completes.
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            switch ([exporter status]) {
                case AVAssetExportSessionStatusFailed:
                {
                    NSLog(@"Export failed: %@ %@", [[exporter error] localizedDescription], [[exporter error] debugDescription]);
                    break;
                }
                case AVAssetExportSessionStatusCancelled:
                {
                    NSLog(@"Export canceled");
                    break;
                }
                case AVAssetExportSessionStatusCompleted:
                {
                    NSLog(@"Export complete!");
                    NSLog(@"Export URL = %@", [exporter.outputURL absoluteString]);
                    [self altPlayWithUrl:exporter.outputURL];
                    break;
                }
                default:
                {
                    NSLog(@"default");
                    break;
                }
            }
        });
    }];
}
What actually happens: if I select two video clips, the exported video has the second clip appended after the first, rather than both starting at time 0:00 with the second overlaid on top.
This is not the behaviour I expected from what I read in the AVMutableCompositionTrack documentation.
Can anyone shed some light for this helpless lamb?
Edit: Is there any detail missing that prevents anyone from lending me a hand? If so, please leave a comment and I will add it.