
I am creating a Picture-in-Picture video. This method has worked flawlessly (as far as I know) for 1.5 years. Now, on iOS 11, it appears to work only the first time it is called; when it is called to compose a second video (without force-closing the app first), I get the error message below.

I found this question on Stack Overflow, but I am already using the asset track correctly as it describes: AVAssetExportSession export fails non-deterministically with error: "Operation Stopped, NSLocalizedFailureReason=The video could not be composed."

I have included the exact method I am using below. Any help would be greatly appreciated!

Error Message:

Error: Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" 
UserInfo={NSLocalizedFailureReason=The video could not be composed., 
NSLocalizedDescription=Operation Stopped, 
NSUnderlyingError=0x1c04521e0 
{Error Domain=NSOSStatusErrorDomain Code=-17390 "(null)"}}

Method Below:

- (void) composeVideo:(NSString*)videoPIP onVideo:(NSString*)videoBG
{
@try {
    NSError *e = nil;

    AVURLAsset *backAsset, *pipAsset;

    // Load our 2 movies using AVURLAsset
    pipAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:videoPIP] options:nil];
    backAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:videoBG] options:nil];

    if ([[NSFileManager defaultManager] fileExistsAtPath:videoPIP])
    {
        NSLog(@"PIP File Exists!");
    }
    else
    {
        NSLog(@"PIP File DOESN'T Exist!");
    }

    if ([[NSFileManager defaultManager] fileExistsAtPath:videoBG])
    {
        NSLog(@"BG File Exists!");
    }
    else
    {
        NSLog(@"BG File DOESN'T Exist!");
    }

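    // Note: width and height are swapped in the two background scale factors
    // below, since the background track is rotated 90 degrees in its layer
    // instruction further down.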
    float scaleH = VIDEO_SIZE.height / [[[backAsset tracksWithMediaType:AVMediaTypeVideo ] objectAtIndex:0] naturalSize].width;
    float scaleW = VIDEO_SIZE.width / [[[backAsset tracksWithMediaType:AVMediaTypeVideo ] objectAtIndex:0] naturalSize].height;

    float scalePIP = (VIDEO_SIZE.width * 0.25) / [[[pipAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize].width;

    // Create AVMutableComposition Object - this object will hold our multiple AVMutableCompositionTracks.
    AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];

    // Create the first AVMutableCompositionTrack by adding a new track to our AVMutableComposition.
    AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

    // Set the length of firstTrack equal to the duration of pipAsset, and insert pipAsset at kCMTimeZero so the video plays from the start of the track.
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, pipAsset.duration) ofTrack:[[pipAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&e];
    if (e)
    {
        NSLog(@"Error0: %@",e);
        e = nil;
    }

    // Repeat the same process for the 2nd track and also start at kCMTimeZero so both tracks will play simultaneously.
    AVMutableCompositionTrack *secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, backAsset.duration) ofTrack:[[backAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&e];

    if (e)
    {
        NSLog(@"Error1: %@",e);
        e = nil;
    }

    // We also need the audio track!
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, backAsset.duration) ofTrack:[[backAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:&e];
    if (e)
    {
        NSLog(@"Error2: %@",e);
        e = nil;
    }


    // Create an AVMutableVideoCompositionInstruction object - Contains the array of AVMutableVideoCompositionLayerInstruction objects.
    AVMutableVideoCompositionInstruction * MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];

    // Set the time range to the longer of the two asset durations.
    MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, (pipAsset.duration.value > backAsset.duration.value) ? pipAsset.duration : backAsset.duration);

    // Create an AVMutableVideoCompositionLayerInstruction object that uses a CGAffineTransform to scale our first track down and move it to the top-left of the screen.
    AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];

    //CGAffineTransform Scale1 = CGAffineTransformMakeScale(0.3f,0.3f);
    CGAffineTransform Scale1 = CGAffineTransformMakeScale(scalePIP, scalePIP);

    // Top Left
    CGAffineTransform Move1 = CGAffineTransformMakeTranslation(3.0, 3.0);

    [FirstlayerInstruction setTransform:CGAffineTransformConcat(Scale1,Move1) atTime:kCMTimeZero];

    // Repeat for the second track.
    AVMutableVideoCompositionLayerInstruction *SecondlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack];

    CGAffineTransform Scale2 = CGAffineTransformMakeScale(scaleW, scaleH);
    CGAffineTransform rotateBy90Degrees = CGAffineTransformMakeRotation( M_PI_2);
    CGAffineTransform Move2 = CGAffineTransformMakeTranslation(0.0, ([[[backAsset tracksWithMediaType:AVMediaTypeVideo ] objectAtIndex:0] naturalSize].height) * -1);

    [SecondlayerInstruction setTransform:CGAffineTransformConcat(Move2, CGAffineTransformConcat(rotateBy90Degrees, Scale2)) atTime:kCMTimeZero];

    // Add the 2 created AVMutableVideoCompositionLayerInstruction objects to our AVMutableVideoCompositionInstruction.
    MainInstruction.layerInstructions = [NSArray arrayWithObjects:FirstlayerInstruction, SecondlayerInstruction, nil];

    // Create an AVMutableVideoComposition object.
    AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
    MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
    MainCompositionInst.frameDuration = CMTimeMake(1, 30);


    // Set the render size (previously the screen size; now the VIDEO_SIZE constant).
    //        MainCompositionInst.renderSize = [[UIScreen mainScreen] bounds].size;
    MainCompositionInst.renderSize = VIDEO_SIZE;


    NSString  *fileName = [NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), @"fullreaction.MP4"];

    // Remove any previous output file at this path.
    if ([[NSFileManager defaultManager] fileExistsAtPath:fileName])
    {
        [[NSFileManager defaultManager] removeItemAtPath:fileName error:nil];
    }

    // Now we need to save the video.
    NSURL *url = [NSURL fileURLWithPath:fileName];
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:QUALITY];
    exporter.videoComposition = MainCompositionInst;
    exporter.outputURL=url;
    exporter.outputFileType = AVFileTypeMPEG4;


    [exporter exportAsynchronouslyWithCompletionHandler:
     ^(void)
     {
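         // Note: this logs "File Saved" unconditionally; in practice you should check
         // exporter.status (and exporter.error, which is nil on success) to detect failures.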
         NSLog(@"File Saved as %@!", fileName);
         NSLog(@"Error: %@", exporter.error);
         [self performSelectorOnMainThread:@selector(runProcessingComplete) withObject:nil waitUntilDone:false];
     }];

}
@catch (NSException *ex) {
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error 3" message:[NSString stringWithFormat:@"%@",ex]
                                                   delegate:self cancelButtonTitle:@"OK" otherButtonTitles: nil];
    [alert show];
}


}
Comments:

Spoke with Apple developer support, and after some back and forth it is suspected this is an iOS bug. I was asked to file a bug report with Apple. They are hoping the engineers will provide a workaround; I will keep this updated as I get answers. – berr08

Any updates? I'm really stuck on this! – Ed Filowat

Sorry, yes, I have a solution for my problem; the Apple bug team got back to me with the root cause. Will post now. – berr08

1 Answer


The Cause: It turns out the "MainInstruction" timeRange is incorrect.

CMTime objects cannot be compared through their value field alone. A CMTime is a rational number (a value divided by a timescale), so two times with different timescales can compare incorrectly when only the raw values are examined. Instead, you must use CMTIME_COMPARE_INLINE (or CMTimeCompare), which normalizes the timescales before comparing.
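As a minimal sketch of the pitfall (the durations here are made up for illustration):

CMTime twoSeconds   = CMTimeMake(1200, 600);  // 2 seconds at timescale 600 -> value == 1200
CMTime threeSeconds = CMTimeMake(300, 100);   // 3 seconds at timescale 100 -> value == 300

// Wrong: compares raw values, concluding that 2 seconds > 3 seconds.
BOOL wrong = twoSeconds.value > threeSeconds.value;               // YES (incorrect)

// Right: normalizes the timescales before comparing.
BOOL right = CMTIME_COMPARE_INLINE(twoSeconds, >, threeSeconds);  // NO (correct)

So whichever asset happens to have the finer timescale can win the raw value comparison regardless of its actual duration, which is why the composition's time range came out wrong.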

To fix, replace this line:

MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, (pipAsset.duration.value > backAsset.duration.value) ? pipAsset.duration : backAsset.duration);

With this line:

MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTIME_COMPARE_INLINE(pipAsset.duration, >, backAsset.duration) ? pipAsset.duration : backAsset.duration);
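An equivalent alternative, if you prefer to avoid the ternary (this is my suggestion, not part of the original fix): CoreMedia also provides CMTimeMaximum, which returns the greater of two CMTimes.

MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMaximum(pipAsset.duration, backAsset.duration));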