26
votes

We add subtitles to a video recorded by the user, but the export by our AVAssetExportSession object fails non-deterministically: sometimes it works, and sometimes it doesn't. We haven't even found a way to reproduce the error reliably.

We noticed the asset tracks seem to get lost during export.

Before exporting, there are two tracks (one for audio, one for video) as expected. But checking the number of tracks for the same file URL in exportDidFinish shows 0 tracks. So something seems wrong with the export process.

Update: Commenting out exporter.videoComposition = mutableComposition fixes the error, but of course no transforms are applied to the video. So the problem seems to lie in creating the AVMutableVideoComposition, which then causes problems downstream during export. Documentation and tutorials on AVMutableVideoComposition are sparse, so even if you don't have a solution, recommendations for references beyond Apple's documentation would be helpful.

Error:

Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo=0x170676e80 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}

Code:

    let videoAsset = AVURLAsset(URL: fileUrl, options: nil)
    let mixComposition = AVMutableComposition()
    let videoTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID(kCMPersistentTrackID_Invalid))
    let audioTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID(kCMPersistentTrackID_Invalid))

    let sourceVideoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack
    let sourceAudioTrack = videoAsset.tracksWithMediaType(AVMediaTypeAudio)[0] as! AVAssetTrack
    videoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), ofTrack: sourceVideoTrack, atTime: kCMTimeZero, error: nil)
    audioTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), ofTrack: sourceAudioTrack, atTime: kCMTimeZero, error: nil)

    // Create something mutable???
    // -- Create instruction
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
    let videoLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: sourceVideoTrack)
    instruction.layerInstructions = [videoLayerInstruction]

    let mutableComposition = AVMutableVideoComposition()
    //mutableComposition.renderSize = videoTrack.naturalSize
    mutableComposition.renderSize = CGSize(width: 320, height: 320)
    mutableComposition.frameDuration = CMTimeMake(1, 60)
    mutableComposition.instructions = [instruction]

    // Animate
    mutableComposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, inLayer: parentLayer)

    // -- Get path
    let fileName = "/editedVideo-\(arc4random() % 10000).mp4"
    let allPaths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
    let docsPath = allPaths[0] as! NSString
    let exportPath = docsPath.stringByAppendingFormat(fileName)
    let exportUrl = NSURL.fileURLWithPath(exportPath as String)!

    println("Tracks before export: \(mixComposition.tracks.count). File URL: \(exportUrl)")

    // -- Remove old video?
    if NSFileManager.defaultManager().fileExistsAtPath(exportPath as String) {
        println("Deleting existing file\n")
        NSFileManager.defaultManager().removeItemAtPath(exportPath as String, error: nil)
    }

    // -- Create exporter
    let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
    exporter.videoComposition = mutableComposition
    exporter.outputFileType = AVFileTypeMPEG4
    exporter.outputURL = exportUrl
    exporter.shouldOptimizeForNetworkUse = true

    // -- Export video
    exporter.exportAsynchronouslyWithCompletionHandler({
        self.exportDidFinish(exporter)
    })


func exportDidFinish(exporter: AVAssetExportSession) {
    println("Exported video with status: \(getExportStatus(exporter))")

    // Save video to photo album
    let assetLibrary = ALAssetsLibrary()
    assetLibrary.writeVideoAtPathToSavedPhotosAlbum(exporter.outputURL, completionBlock: {(url: NSURL!, error: NSError!) in
        println("Saved video to album \(exporter.outputURL)")
        if (error != nil) {
            println("Error saving video")
        }
    })

    // Check asset tracks
    let asset = AVAsset.assetWithURL(exporter.outputURL) as? AVAsset
    println("Tracks after export: \(asset!.tracks.count). File URL: \(exporter.outputURL)")
}

Questions:

1) What's causing the problem, and what's the solution?

2) Any suggestions on how to reproduce the error consistently, which would hopefully help debug the problem?

5
Consider adding error handling to your insertTimeRange calls and your NSFileManager calls; passing nil for the error parameter is never a good idea, since if something goes wrong, you won't hear about it. Personally, I wouldn't even consider working on this code until that's done: anyone who blithely throws away error checking is just being willfully silly. Then consider adding more logging to the process so that you know whether everything has gone okay up to the point where the export starts. Also, I wonder about your randomization approach to file names, but I doubt that's the issue. – matt
@matt You're 100% right on the error messages. Not sure, though, what other logging to include beyond verifying that multiple tracks exist -- what do you recommend? – Crashalot
@matt Added the error messages, but nothing gets triggered by insertTimeRange. The NSFileManager error is not relevant because we actually comment out that line: we're using a random filename scheme during testing and monitoring the logs to make sure filename conflicts aren't a problem (i.e., "Deleting existing file" never gets printed). – Crashalot
Could this have something to do with the size of the video (using up resources) or running out of memory on the device? – Jordan
@Jordan The videos are only 2 seconds long. How would you check this? There are no memory warnings during the export process. – Crashalot

5 Answers

30
votes

What seems to be the cure is making sure the assetTrack parameter of AVMutableVideoCompositionLayerInstruction comes not from the AVURLAsset object, but from the mutable composition track returned by addMutableTrackWithMediaType.

In other words, this line:

let videoLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: sourceVideoTrack)

Should be:

let videoLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
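
For completeness, a fuller sketch of the corrected wiring in current Swift syntax (a minimal sketch assuming the same setup as the question, not the original code; the key point is that the layer instruction references the composition's own track):

    import AVFoundation

    // Sketch: build the composition and video composition so that the layer instruction
    // is created from the composition track (videoTrack), not from the source asset's track.
    func makeVideoComposition(for videoAsset: AVAsset) throws -> (AVMutableComposition, AVMutableVideoComposition) {
        let mixComposition = AVMutableComposition()
        guard let videoTrack = mixComposition.addMutableTrack(withMediaType: .video,
                                                              preferredTrackID: kCMPersistentTrackID_Invalid),
              let sourceVideoTrack = videoAsset.tracks(withMediaType: .video).first else {
            throw NSError(domain: "MakeVideoComposition", code: -1, userInfo: nil)
        }
        try videoTrack.insertTimeRange(CMTimeRange(start: .zero, duration: videoAsset.duration),
                                       of: sourceVideoTrack, at: .zero)

        let instruction = AVMutableVideoCompositionInstruction()
        instruction.timeRange = CMTimeRange(start: .zero, duration: videoAsset.duration)
        // The fix: pass the composition track, not sourceVideoTrack.
        instruction.layerInstructions = [AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)]

        let videoComposition = AVMutableVideoComposition()
        videoComposition.renderSize = videoTrack.naturalSize
        videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
        videoComposition.instructions = [instruction]
        return (mixComposition, videoComposition)
    }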

Argh. Hours of endless frustration because sometimes the first line worked, and sometimes it didn't.

Still would like to award the bounty to someone.

If you can explain why the first line failed non-deterministically, instead of every time, or provide a deeper tutorial on AVMutableComposition and its related classes -- for the purposes of adding text overlays to user-recorded videos -- the bounty is all yours. :)

7
votes

I'm guessing that some of your videos' sourceVideoTracks are either:

  • tracks that are non-contiguous
  • tracks whose time range is shorter than the video's full duration

The mutable track videoTrack, on the other hand, is guaranteed to cover the correct time range (as dictated by the AVMutableVideoCompositionInstruction), so it always works.
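
One way to check that guess (a hypothetical diagnostic helper, not from the original answer, in current Swift syntax and using the question's videoAsset):

    import AVFoundation

    // Hypothetical diagnostic: log whether the source video track covers the asset's
    // full duration and whether it is contiguous (a single segment).
    func logTrackCoverage(of videoAsset: AVAsset) {
        guard let track = videoAsset.tracks(withMediaType: .video).first else {
            print("No video track found")
            return
        }
        print("Asset duration: \(CMTimeGetSeconds(videoAsset.duration)) s")
        print("Track time range: start \(CMTimeGetSeconds(track.timeRange.start)) s, duration \(CMTimeGetSeconds(track.timeRange.duration)) s")
        print("Track segments: \(track.segments.count)")   // more than 1 suggests a non-contiguous track
    }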

7
votes

I resolved this problem by using the AVAssetExportPresetPassthrough export preset rather than a resolution-specific preset or AVAssetExportPresetHighestQuality:

let exportSession = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetPassthrough)

This should use the resolution of the imported video in the exported file.
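
For context, a minimal usage sketch in current Swift syntax (not from the original answer; composition and outputURL are assumed to exist in the surrounding code):

    import AVFoundation

    // Sketch: a passthrough export re-muxes the source media without re-encoding,
    // so the output keeps the original resolution and bitrate.
    // Note: because passthrough copies samples as-is, it does not render an AVVideoComposition overlay.
    if let exportSession = AVAssetExportSession(asset: composition,
                                                presetName: AVAssetExportPresetPassthrough) {
        exportSession.outputFileType = exportSession.supportedFileTypes.first
        exportSession.outputURL = outputURL
        exportSession.exportAsynchronously {
            print("Export finished with status: \(exportSession.status.rawValue)")
        }
    }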

1
votes

Late to the party, but here's what worked for me. The export would fail "randomly", so I debugged the length of the video track and the length of the audio track.

I noticed that when the audio track was longer than the video track the export would fail.

So I made this change:

let assetVideoTrack = asset.tracks(withMediaType: .video).first!
let assetAudioTrack = asset.tracks(withMediaType: .audio).first!

// Pick whichever track covers the longer time range.
// CMTimeCompare compares the full CMTime values, so differing timescales are handled correctly.
var validTimeRange: CMTimeRange
if CMTimeCompare(assetVideoTrack.timeRange.duration, assetAudioTrack.timeRange.duration) > 0 {
    validTimeRange = assetVideoTrack.timeRange
} else {
    validTimeRange = assetAudioTrack.timeRange
}

So then I would use that value here:

let instruction = AVMutableVideoCompositionInstruction()
instruction.layerInstructions = [layerInstructions]
instruction.timeRange = validTimeRange

This solved the problem for me. It works 100% of the time now.

The exported video looks good, and the recorded audio sounds great.

The answer to the original questions:

1) What's causing the problem, and what's the solution?

2) Any suggestions on how to reproduce the error consistently, which would hopefully help debug the problem?

For me:

  1. The cause is slightly different durations between the video and audio tracks; using the shorter of the two for instruction.timeRange makes the export fail. The solution is to use the longer of the two time ranges, as shown above.

  2. To reproduce it consistently, set instruction.timeRange to the shorter of the two tracks' time ranges and the export fails every time.
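
As a variant (a sketch not part of the original answer), CoreMedia's CMTimeMaximum expresses the same take-the-longer-duration idea directly, assuming both tracks start at time zero:

    import AVFoundation

    // Variant sketch: CMTimeMaximum returns the longer of the two durations and handles
    // differing timescales. Reuses assetVideoTrack / assetAudioTrack from the snippet above.
    let longerDuration = CMTimeMaximum(assetVideoTrack.timeRange.duration,
                                       assetAudioTrack.timeRange.duration)
    instruction.timeRange = CMTimeRange(start: .zero, duration: longerDuration)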

0
votes

Setting the render size's width or height to zero can also produce this failure: Operation Stopped, NSLocalizedFailureReason=The video could not be composed. Make sure the render size you assign is non-zero:

self.mutableVideoComposition.renderSize = CGSize(width: assetVideoTrack.naturalSize.height, height: assetVideoTrack.naturalSize.width)
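
A defensive check along these lines (a sketch not from the original answer; deriving the size via preferredTransform is an assumption to handle rotated recordings):

    import AVFoundation

    // Sketch: derive a non-zero render size from the source track before building the composition.
    // assetVideoTrack and mutableVideoComposition are assumed to be the ones from this answer.
    let transformedSize = assetVideoTrack.naturalSize.applying(assetVideoTrack.preferredTransform)
    let renderWidth = abs(transformedSize.width)
    let renderHeight = abs(transformedSize.height)

    if renderWidth > 0 && renderHeight > 0 {
        mutableVideoComposition.renderSize = CGSize(width: renderWidth, height: renderHeight)
    } else {
        print("Source track reports a zero dimension; skipping the video composition")
    }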