
I am attempting to stitch together video assets using AVComposition, based on the sample code here: https://developer.apple.com/library/mac/samplecode/AVCompositionDebugViewer/Introduction/Intro.html

On OS X it works perfectly, but on iOS, playback via AVPlayer only works with one or two input clips. If I add a third, nothing is rendered on the AVPlayerLayer. Weirdly, if I observe the AVPlayer playback time using addPeriodicTimeObserverForInterval, the video appears to play for the correct duration, yet nothing shows up on the layer. Does anyone have any insight into why this would be?


1 Answer


It turns out I was creating CMTime objects with differing timescale values, which caused rounding errors and left gaps between the segments in my tracks. If a track had a gap, it would simply fail to play. Ensuring that all my CMTime objects used the same timescale made everything work perfectly.
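For anyone hitting the same issue, here is a minimal sketch (not the original poster's code) of stitching clips into an AVMutableComposition while converting every duration to one shared timescale before inserting it. The asset URLs and the choice of 600 as the timescale are assumptions for illustration; the point is that every CMTime used for insertion is expressed on the same scale, so successive segments butt up against each other without rounding gaps.

```swift
import AVFoundation

func makeComposition(from assetURLs: [URL]) -> AVMutableComposition {
    let composition = AVMutableComposition()
    guard let videoTrack = composition.addMutableTrack(
        withMediaType: .video,
        preferredTrackID: kCMPersistentTrackID_Invalid) else {
        return composition
    }

    // Use one timescale everywhere; 600 is a common choice for video.
    let timescale: CMTimeScale = 600
    var cursor = CMTime.zero

    for url in assetURLs {
        let asset = AVURLAsset(url: url)
        guard let sourceTrack = asset.tracks(withMediaType: .video).first else { continue }

        // Convert the duration to the shared timescale before using it,
        // so each insertion starts exactly where the previous one ended.
        let duration = asset.duration.convertScale(timescale, method: .default)
        let range = CMTimeRange(start: .zero, duration: duration)

        do {
            try videoTrack.insertTimeRange(range, of: sourceTrack, at: cursor)
            cursor = CMTimeAdd(cursor, duration)
        } catch {
            print("Failed to insert \(url.lastPathComponent): \(error)")
        }
    }
    return composition
}
```

The same idea applies to any other CMTime values you build yourself (start times, edit points, and so on): derive them all from the one timescale rather than mixing the native timescales of the source assets.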