3
votes

In my app, I'm recording short videos and adding them to an NSMutableArray as AVAsset objects so that I keep a record of what has been captured. When the user presses a button to merge them, the final result is only the first video taken (for example, if three short videos were taken, the merged result contains only the first video and the others do not appear). My code for iterating over the NSMutableArray and stitching the videos together is here:

if (self.capturedVideos.count != 0) {        
    //Create the AVMutableComposition object. This object will hold our multiple AVMutableCompositionTracks.
    AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];

    for (AVAsset *asset in self.capturedVideos) {
        //Check if this is the first video captured so that it is placed at time 0.
        if ([self.capturedVideos indexOfObject:asset] == 0) {
            AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
            [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
            previousAsset = asset;
        } else{
            AVMutableCompositionTrack *track = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
            [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:previousAsset.duration error:nil];
            previousAsset = asset;
        }
    }

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs =  [documentsDirectory stringByAppendingPathComponent:
    [NSString stringWithFormat:@"mergeVideo-%d.mov",arc4random() % 1000]];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];
    // 5 - Create exporter
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL=url;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [self exportDidFinish:exporter];
        });
    }];
}

The code after the for loop exports the merged video so it can be saved to the camera roll. So where is my mistake? The durations are right (so there is no overlapping). However, I have a doubt about one thing: there is an instance variable, previousAsset, that I declared in braces after @implementation; it tracks the previous asset added, so I know where to place the next one. It's of class AVAsset, and I didn't initialize it, because when I try to, I get an error:

previousAsset = [[AVAsset alloc] init];
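
For reference, the declaration I described looks roughly like this (the class name here is just a placeholder):

@implementation CameraViewController {      // placeholder class name
    AVAsset *previousAsset;                 // tracks the last asset appended to the composition
}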


4 Answers

6
votes

Swift version

func merge(arrayVideos: [AVAsset], completion: @escaping (_ exporter: AVAssetExportSession) -> Void) {

  let mainComposition = AVMutableComposition()
  let compositionVideoTrack = mainComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
  compositionVideoTrack?.preferredTransform = CGAffineTransform(rotationAngle: .pi / 2)

  let soundtrackTrack = mainComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)

  var insertTime = kCMTimeZero

  for videoAsset in arrayVideos {
    try! compositionVideoTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: videoAsset.tracks(withMediaType: .video)[0], at: insertTime)
    try! soundtrackTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: videoAsset.tracks(withMediaType: .audio)[0], at: insertTime)

    insertTime = CMTimeAdd(insertTime, videoAsset.duration)
  }

  let outputFileURL = URL(fileURLWithPath: NSTemporaryDirectory() + "merge.mp4")

  let fileManager = FileManager()
  try? fileManager.removeItem(at: outputFileURL) // remove any previous output at this URL

  let exporter = AVAssetExportSession(asset: mainComposition, presetName: AVAssetExportPresetHighestQuality)

  exporter?.outputURL = outputFileURL
  exporter?.outputFileType = AVFileType.mp4
  exporter?.shouldOptimizeForNetworkUse = true

  exporter?.exportAsynchronously {
    DispatchQueue.main.async {
      completion(exporter!)
    }
  }
}
3
votes

This will work fine

      AVMutableComposition *mainComposition = [[AVMutableComposition alloc] init];
      AVMutableCompositionTrack *compositionVideoTrack = [mainComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];


      AVMutableCompositionTrack *soundtrackTrack = [mainComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
      CMTime insertTime = kCMTimeZero;

      for (AVAsset *videoAsset in assets) {

          [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:insertTime error:nil];

          [soundtrackTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:insertTime error:nil];

          // Updating the insertTime for the next insert
          insertTime = CMTimeAdd(insertTime, videoAsset.duration);
      }

      NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
      NSString *documentsDirectory = [paths objectAtIndex:0];

      // Creating a full path and URL to the exported video
      NSString *outputVideoPath = [documentsDirectory stringByAppendingPathComponent:
                              [NSString stringWithFormat:@"mergeVideo-%d.mp4", arc4random() % 1000]];
      NSURL *outputVideoUrl = [NSURL fileURLWithPath:outputVideoPath];
      AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mainComposition presetName:AVAssetExportPreset640x480];

      // Setting attributes of the exporter
      exporter.outputURL = outputVideoUrl;
      exporter.outputFileType = AVFileTypeMPEG4;   // or AVFileTypeQuickTimeMovie with a .mov extension
      exporter.shouldOptimizeForNetworkUse = YES;
      [exporter exportAsynchronouslyWithCompletionHandler:^{
          dispatch_async(dispatch_get_main_queue(), ^{
              [self exportDidFinish:exporter];
          });
      }];

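Note that the exportDidFinish: method called in the completion handler above (and in the question) is not shown anywhere. A minimal sketch of what it might look like, assuming the goal from the question of saving the merged file to the camera roll; the Photos-framework approach below is just one option and is not part of the original answer:

// Hypothetical exportDidFinish: implementation; requires the Photos framework
// (@import Photos;) and photo-library permission.
- (void)exportDidFinish:(AVAssetExportSession *)session {
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSURL *outputURL = session.outputURL;
        [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
            // Create a new Photos asset from the exported movie file
            [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:outputURL];
        } completionHandler:^(BOOL success, NSError *error) {
            if (!success) {
                NSLog(@"Could not save merged video to the camera roll: %@", error);
            }
        }];
    } else if (session.status == AVAssetExportSessionStatusFailed) {
        NSLog(@"Export failed: %@", session.error);
    }
}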

2
votes

Updated @brenoxp's answer for Swift 5.1:

func merge(arrayVideos: [AVAsset], completion: @escaping (URL?, Error?) -> Void) {

  let mainComposition = AVMutableComposition()
  let compositionVideoTrack = mainComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
  compositionVideoTrack?.preferredTransform = CGAffineTransform(rotationAngle: .pi / 2)

  let soundtrackTrack = mainComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)

  var insertTime = CMTime.zero

  for videoAsset in arrayVideos {
    try! compositionVideoTrack?.insertTimeRange(CMTimeRangeMake(start: .zero, duration: videoAsset.duration), of: videoAsset.tracks(withMediaType: .video)[0], at: insertTime)
    try! soundtrackTrack?.insertTimeRange(CMTimeRangeMake(start: .zero, duration: videoAsset.duration), of: videoAsset.tracks(withMediaType: .audio)[0], at: insertTime)

    insertTime = CMTimeAdd(insertTime, videoAsset.duration)
  }

  let outputFileURL = URL(fileURLWithPath: NSTemporaryDirectory() + "merge.mp4")

  let fileManager = FileManager()
  try? fileManager.removeItem(at: outputFileURL)

  let exporter = AVAssetExportSession(asset: mainComposition, presetName: AVAssetExportPresetHighestQuality)

  exporter?.outputURL = outputFileURL
  exporter?.outputFileType = AVFileType.mp4
  exporter?.shouldOptimizeForNetworkUse = true

  exporter?.exportAsynchronously {
    DispatchQueue.main.async {
      // Report a failure if the export produced an error; otherwise hand back the output URL
      if let error = exporter?.error {
        completion(nil, error)
      } else {
        completion(exporter?.outputURL, nil)
      }
    }
  }
}
1
vote

There is a great example project on GitHub that is a really good starting point for doing this in a more reusable way within an app:

https://github.com/khoavd-dev/MergeVideos/blob/master/MergeVideos/VideoManager/KVVideoManager.swift