2
votes

I'm trying to create a video composition on iOS that applies both a CIFilter and a Core Animation layer in a single pass. Each operation works on its own, but combining the two in one pass does not seem to work.

When using AVMutableVideoComposition(asset:applyingCIFiltersWithHandler:), the animationTool property seems to be ignored. Has anyone else experienced this? I've seen some people suggest compositing any extra CA layers inside the AVMutableVideoComposition handler, but my CALayer contains animations, so I can't see how that would work reliably.

Here's the code I'm using:

        let clipVideoTrack = asset.tracks(withMediaType:AVMediaTypeVideo)[0]
        let mixComposition = AVMutableComposition()
        let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
        let videoRange = CMTimeRangeMake(startTime ?? kCMTimeZero, CMTimeSubtract( stopTime ?? asset.duration, startTime ?? kCMTimeZero ) )
        try compositionVideoTrack.insertTimeRange(videoRange, of: clipVideoTrack, at: kCMTimeZero)
        let parentLayer = CALayer()
        let videoLayer = CALayer()
        let overlayLayer = CALayer()

        let targetDimention: CGFloat = 900.0
        let videoWidthDivisor = clipVideoTrack.naturalSize.width / targetDimention
        let actualDimention = clipVideoTrack.naturalSize.width / videoWidthDivisor
        let targetVideoSize = CGSize(width: actualDimention, height: actualDimention)

        parentLayer.frame = CGRect(x: 0, y: 0, width: targetVideoSize.width, height: targetVideoSize.height)
        videoLayer.frame = CGRect(x: 0, y: 0, width: targetVideoSize.width, height: targetVideoSize.height)
        overlayLayer.frame = CGRect(x: 0, y: 0, width: targetVideoSize.width, height: targetVideoSize.height)

        parentLayer.addSublayer(videoLayer)

        for annotation in mediaAnnotationContainerView.mediaAnnotationViews
        {
            let renderableLayer = annotation.renderableCALayer(targetSize: targetVideoSize)
            parentLayer.addSublayer(renderableLayer)
        }


        let filter = CIFilter(name: "CISepiaTone")!
        filter.setDefaults()
        let videoComp = AVMutableVideoComposition(asset: asset, applyingCIFiltersWithHandler:
        {   request in
            let source = request.sourceImage.clampingToExtent()
            filter.setValue(source, forKey: kCIInputImageKey)
            let output = filter.outputImage!.cropping(to: request.sourceImage.extent)
            request.finish(with: output, context: nil)
        })

        videoComp.renderSize = targetVideoSize

        videoComp.frameDuration = CMTimeMake(1, 30)
        videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)

        let url = AVAsset.tempMovieUrl

        let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
        exporter?.outputURL = url
        exporter?.outputFileType = AVFileTypeMPEG4
        exporter?.shouldOptimizeForNetworkUse = true
        exporter?.videoComposition = videoComp

        exporter?.exportAsynchronously
        {
            print( "Export completed" )
        }

It seems that videoComp.instructions[0] is an instance of the private AVCoreImageFilterVideoCompositionInstruction class. Replacing it raises an exception, and appending an additional instruction results in the export completing without actually doing anything.

It might be that what I'm trying to do is impossible, and I'll actually have to make two passes over the video (one for the CIFilter, the other for the CALayers). But exporting to a temporary file and then re-processing that file in a second pass doesn't feel right.
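For reference, the two-pass fallback can be sketched roughly as follows (untested; filteredURL and finalURL are placeholder temp-file URLs, and parentLayer/videoLayer are the layers built above). Pass 1 applies the CIFilter and exports; pass 2 reloads the filtered file and burns in the layers with the animationTool:

        // Pass 1: CIFilter only, exported to a temporary file.
        let filter = CIFilter(name: "CISepiaTone")!
        filter.setDefaults()
        let filterComp = AVMutableVideoComposition(asset: asset, applyingCIFiltersWithHandler:
        {   request in
            filter.setValue(request.sourceImage.clampingToExtent(), forKey: kCIInputImageKey)
            let output = filter.outputImage!.cropping(to: request.sourceImage.extent)
            request.finish(with: output, context: nil)
        })

        let pass1 = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality)!
        pass1.outputURL = filteredURL
        pass1.outputFileType = AVFileTypeMPEG4
        pass1.videoComposition = filterComp
        pass1.exportAsynchronously
        {
            guard pass1.status == .completed else { return }

            // Pass 2: CALayer overlay only, applied to the already-filtered footage.
            let filteredAsset = AVAsset(url: filteredURL)
            let layerComp = AVMutableVideoComposition(propertiesOf: filteredAsset)
            layerComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)

            let pass2 = AVAssetExportSession(asset: filteredAsset, presetName: AVAssetExportPresetHighestQuality)!
            pass2.outputURL = finalURL
            pass2.outputFileType = AVFileTypeMPEG4
            pass2.videoComposition = layerComp
            pass2.exportAsynchronously
            {
                print("Two-pass export status: \(pass2.status.rawValue)")
            }
        }

It's inelegant for the reasons above (an extra encode, plus generation loss on the intermediate file), but it avoids the private-instruction problem entirely.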

Does anyone know how to get this to work?

Thanks,

Ray


2 Answers

0
votes

1. Are you running your code on the simulator? It seems an animation layer can't be rendered into the video on the simulator (the layer's background can).

2. If you create an AVVideoCompositionInstruction yourself, make sure you set enablePostProcessing to YES (true in Swift).
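For point 2, a minimal sketch of a hand-built composition (untested, reusing mixComposition, compositionVideoTrack, targetVideoSize, videoLayer, and parentLayer from the question) might look like:

        let videoComp = AVMutableVideoComposition()
        videoComp.renderSize = targetVideoSize
        videoComp.frameDuration = CMTimeMake(1, 30)

        let instruction = AVMutableVideoCompositionInstruction()
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration)
        instruction.enablePostProcessing = true   // without this, the animationTool is skipped

        let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: compositionVideoTrack)
        instruction.layerInstructions = [layerInstruction]
        videoComp.instructions = [instruction]
        videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)

Note that enablePostProcessing is settable on AVMutableVideoCompositionInstruction, which is what you get when you build the instructions yourself rather than via applyingCIFiltersWithHandler.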

0
votes

When you create an AVMutableVideoComposition via init(asset: AVAsset, applyingCIFiltersWithHandler applier: @escaping (AVAsynchronousCIImageFilteringRequest) -> Void), an AVCoreImageFilterVideoCompositionInstruction is added under the hood. Its enablePostProcessing property is read-only, and I didn't find a way to set it to true.
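One partial workaround is to skip the animationTool and composite the overlay inside the same handler. This only works cleanly for static overlays, since an animated layer would have to be re-rasterized at each request.compositionTime; a rough, untested sketch (assuming overlayLayer is a non-animated CALayer sized to the render area, and UIKit is imported):

        // Rasterize the static overlay layer once, up front.
        UIGraphicsBeginImageContextWithOptions(overlayLayer.bounds.size, false, 1.0)
        overlayLayer.render(in: UIGraphicsGetCurrentContext()!)
        let overlayImage = CIImage(image: UIGraphicsGetImageFromCurrentImageContext()!)!
        UIGraphicsEndImageContext()

        let sepia = CIFilter(name: "CISepiaTone")!
        sepia.setDefaults()
        let overFilter = CIFilter(name: "CISourceOverCompositing")!

        let videoComp = AVMutableVideoComposition(asset: asset, applyingCIFiltersWithHandler:
        {   request in
            sepia.setValue(request.sourceImage.clampingToExtent(), forKey: kCIInputImageKey)
            let filtered = sepia.outputImage!.cropping(to: request.sourceImage.extent)
            // Source-over composite of the rasterized overlay on the filtered frame.
            overFilter.setValue(overlayImage, forKey: kCIInputImageKey)
            overFilter.setValue(filtered, forKey: kCIInputBackgroundImageKey)
            request.finish(with: overFilter.outputImage!, context: nil)
        })

For genuinely animated layers, the two-pass export described in the question seems to remain the only reliable route.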