
What I wanted: insert multiple video layers, each with some opacity, all starting at time 0:00 of an AVMutableCompositionTrack.

I read the official AVFoundation documentation carefully, along with many WWDC sessions on this topic, but I couldn't understand why the result does NOT match what the API documentation states.

I can achieve the overlay during playback with two AVPlayerLayer instances. That suggests I could also use AVVideoCompositionCoreAnimationTool to achieve something similar during export, but I would rather reserve CALayer for subtitle/image overlays and animations.
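For reference, the playback-time overlay I mentioned looks roughly like this (a minimal sketch; `playerA`, `playerB`, and `containerView` are assumed to already exist and are not part of the code below):

    // Two AVPlayerLayers stacked in one view; the top layer's CALayer
    // opacity produces the blend during playback.
    AVPlayerLayer *bottomLayer = [AVPlayerLayer playerLayerWithPlayer:playerA];
    bottomLayer.frame = containerView.bounds;

    AVPlayerLayer *topLayer = [AVPlayerLayer playerLayerWithPlayer:playerB];
    topLayer.frame = containerView.bounds;
    topLayer.opacity = 0.5f;

    [containerView.layer addSublayer:bottomLayer];
    [containerView.layer addSublayer:topLayer];

    [playerA play];
    [playerB play];

This works for preview, but the blend only exists on screen; exporting requires the video composition route below.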

What I tried for each AVAsset being inserted:

- (void)addVideo:(AVAsset *)asset_in withOpacity:(float)opacity
{
    // This demo composites semi-transparent videos, so we insert every video at time 0:00.
    [_videoCompositionTrack insertTimeRange:CMTimeRangeMake( kCMTimeZero, asset_in.duration )
                                    ofTrack:[ [asset_in tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0 ]
                                     atTime:kCMTimeZero error:nil ];

    AVMutableVideoCompositionInstruction *mutableVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    AVAssetTrack *assettrack_in = [ [asset_in tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0 ];
    mutableVideoCompositionInstruction.timeRange = CMTimeRangeMake( kCMTimeZero, assettrack_in.timeRange.duration );
    AVMutableVideoCompositionLayerInstruction *videoCompositionLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:_videoCompositionTrack];
    [videoCompositionLayerInstruction setTransform:assettrack_in.preferredTransform atTime:kCMTimeZero];
    [videoCompositionLayerInstruction setOpacity:opacity atTime:kCMTimeZero];
    mutableVideoCompositionInstruction.layerInstructions = @[videoCompositionLayerInstruction];
    [_arrayVideoCompositionInstructions addObject:mutableVideoCompositionInstruction];
}

Please note that insertTimeRange: is called with atTime:kCMTimeZero, so I expect every clip to be placed at the beginning of the video composition.

What I tried for exporting:

- (IBAction)ExportAndPlay:(id)sender
{
    _mutableVideoComposition.instructions = [_arrayVideoCompositionInstructions copy];

    // Create a static date formatter so we only have to initialize it once.
    static NSDateFormatter *kDateFormatter;
    if (!kDateFormatter) {
        kDateFormatter = [[NSDateFormatter alloc] init];
        kDateFormatter.dateStyle = NSDateFormatterMediumStyle;
        kDateFormatter.timeStyle = NSDateFormatterShortStyle;
    }
    // Create the export session with the composition and set the preset to the highest quality.
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:_mutableComposition presetName:AVAssetExportPresetHighestQuality];
    // Set the desired output URL for the file created by the export process.
    exporter.outputURL = [[[[NSFileManager defaultManager] URLForDirectory:NSDocumentDirectory inDomain:NSUserDomainMask appropriateForURL:nil create:YES error:nil] URLByAppendingPathComponent:[kDateFormatter stringFromDate:[NSDate date]]] URLByAppendingPathExtension:CFBridgingRelease(UTTypeCopyPreferredTagWithClass((CFStringRef)AVFileTypeQuickTimeMovie, kUTTagClassFilenameExtension))];
    // Set the output file type to be a QuickTime movie.
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = _mutableVideoComposition;
    // Asynchronously export the composition to a video file and save this file to the camera roll once export completes.
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            switch ([exporter status]) {
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"Export failed: %@ %@", [[exporter error] localizedDescription], [[exporter error] debugDescription]);
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Export canceled");
                    break;
                case AVAssetExportSessionStatusCompleted:
                    NSLog(@"Export complete!");
                    NSLog(@"Export URL = %@", [exporter.outputURL absoluteString]);
                    [self altPlayWithUrl:exporter.outputURL];
                    break;
                default:
                    NSLog(@"Unexpected export status: %ld", (long)[exporter status]);
                    break;
            }

        } );
    }];
}

What happens: if I select two video clips, the export produces a video with the second clip appended after the first.

This is not the behaviour I expected from the AVMutableCompositionTrack documentation.

May anyone shed some light for this helpless lamb?

Edit: Is there any detail missing that is preventing anyone from lending a hand? If so, please leave a comment and I will fill it in.


1 Answer


Okay, it turns out I asked this because of a misunderstanding of the AVMutableCompositionTrack API.

If you want to blend two videos as overlays, as I do, you need two AVMutableCompositionTrack instances, both created from the same AVMutableComposition, like this:

    // 0. Set up one AVMutableCompositionTrack for EACH AVAsset!
    AVMutableCompositionTrack *mutableCompositionVideoTrack1 = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *mutableCompositionVideoTrack2 = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

Then insert each AVAsset into its OWN AVMutableCompositionTrack:

    AVAssetTrack *videoAssetTrack1 = [[[_arrayVideoAssets firstObject] tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *videoAssetTrack2 = [[[_arrayVideoAssets lastObject] tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [mutableCompositionVideoTrack1 insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAssetTrack1.timeRange.duration) ofTrack:videoAssetTrack1 atTime:kCMTimeZero error:nil];
    [mutableCompositionVideoTrack2 insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAssetTrack2.timeRange.duration) ofTrack:videoAssetTrack2 atTime:kCMTimeZero error:nil];

Then set up the AVMutableVideoComposition with one layer instruction for each AVMutableCompositionTrack:

AVMutableVideoCompositionInstruction *compInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
compInstruction.timeRange = CMTimeRangeMake( kCMTimeZero, videoAssetTrack1.timeRange.duration );
AVMutableVideoCompositionLayerInstruction *layerInstruction1 = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:mutableCompositionVideoTrack1];
[layerInstruction1 setOpacity:0.5f atTime:kCMTimeZero];

AVMutableVideoCompositionLayerInstruction *layerInstruction2 = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:mutableCompositionVideoTrack2];
[layerInstruction2 setOpacity:0.8f atTime:kCMTimeZero];
// Scale the second video to half size and place it in the bottom-right quadrant.
CGAffineTransform transformScale = CGAffineTransformMakeScale(0.5f, 0.5f);
CGAffineTransform transformTranslate = CGAffineTransformMakeTranslation(videoComposition.renderSize.width / 2, videoComposition.renderSize.height / 2);
[layerInstruction2 setTransform:CGAffineTransformConcat(transformScale, transformTranslate) atTime:kCMTimeZero];
compInstruction.layerInstructions = @[ layerInstruction1, layerInstruction2 ];
videoComposition.instructions = @[ compInstruction ];

Finally, the export works as expected. Sorry to bother anyone who took a look.
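For completeness, the `videoComposition` variable used above is assumed to be created along these lines (the frame duration and render size are illustrative choices, not values from the original code):

    // Assumed setup for the videoComposition referenced above.
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps
    videoComposition.renderSize = videoAssetTrack1.naturalSize;

Note that renderSize must be set before building the translation transform, since the transform reads videoComposition.renderSize.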