
I am using the code below to stream two videos sequentially, but it does not show any video in the simulator; the screen is totally blank.

Also, how can I seek through these two videos? For example, if one video is 2 minutes long and the second is 3 minutes, I need the total duration of both and the ability to seek across them: when I slide the slider to 4 minutes, the second video should play from its 2-minute mark onward.

Is it possible?

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    NSURL *url1 = [NSURL URLWithString:@"http://www.tools4movies.com/dvd_catalyst_profile_samples/Harold%20Kumar%203%20Christmas%20bionic.mp4"];
    NSURL *url2 = [NSURL URLWithString:@"http://www.tools4movies.com/dvd_catalyst_profile_samples/Harold%20Kumar%203%20Christmas%20tablet.mp4"];

    NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];

    AVMutableComposition *composition = [[AVMutableComposition alloc] init];

    asset1 = [[AVURLAsset alloc] initWithURL:url1 options:options];
    AVURLAsset * asset2 = [[AVURLAsset alloc]initWithURL:url2 options:options];

    CMTime insertionPoint = kCMTimeZero;
    NSError * error = nil;
    composition = [AVMutableComposition composition];

    if (![composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset1.duration) 
                              ofAsset:asset1 
                               atTime:insertionPoint 
                                error:&error]) 
    {
        NSLog(@"error: %@",error);
    }

    insertionPoint = CMTimeAdd(insertionPoint, asset1.duration);

    if (![composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset2.duration) 
                              ofAsset:asset2 
                               atTime:insertionPoint 
                                error:&error]) 
    {
        NSLog(@"error: %@",error);
    }

    AVPlayerItem * item = [[AVPlayerItem alloc] initWithAsset:composition];
    player = [AVPlayer playerWithPlayerItem:item];
    AVPlayerLayer * layer = [AVPlayerLayer playerLayerWithPlayer:player];

    [layer setFrame:CGRectMake(0, 0, 320, 480)];
    [[[self view] layer] addSublayer:layer];
    [player play];   
}

Can anyone tell me what the error in my code is?

Have you tested this code on the device? – Sam
See the answer to this question: stackoverflow.com/questions/8318422/… – djromero
@Sam No, I haven't checked it on a real device. – Omer Waqas Khan
@madmw Then is there any other way to achieve the above scenario? – Omer Waqas Khan
AVQueuePlayer? Two AVPlayers? Downloading the files first? There are options. You need to know the duration of each video and make some calculations before deciding which video to play and which time to seek. – djromero

1 Answer


The simulator is NOT able to display video. Neither the built-in UIImagePickerController nor any other video controller will work; it's not implemented and mostly appears black or red in the iOS Simulator. You have to debug on an iOS device. Sometimes debugging will not work properly there either; use NSLog() instead, which always works (e.g., when you compile without debug information in a 'release' build).

You can seek using the player.

If mp is your AVPlayer:

[mp pause];
CMTime position = mp.currentTime;

// maybe replace the current item first
[mp replaceCurrentItemWithPlayerItem:[AVPlayerItem playerItemWithAsset:self.composition]];

[mp seekToTime:position];
[mp play];

Summary:
Edit: use the composition and player item.
Seek: use the player.
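Applied to your slider question: because both clips sit in one composition, the slider can address the combined timeline directly and AVFoundation picks the right clip for you. A minimal sketch in Objective-C (self.player and self.slider are illustrative names; this assumes the player item was built from the composition):

```objc
// Configure the slider once the composition is built.
- (void)configureSliderWithComposition:(AVComposition *)composition
{
    self.slider.maximumValue = (float)CMTimeGetSeconds(composition.duration);
    [self.slider addTarget:self
                    action:@selector(sliderChanged:)
          forControlEvents:UIControlEventValueChanged];
}

- (void)sliderChanged:(UISlider *)slider
{
    // The composition is one continuous timeline, so a slider value of
    // 4 minutes lands 2 minutes into the second clip automatically.
    CMTime target = CMTimeMakeWithSeconds(slider.value, 600);
    [self.player seekToTime:target];
}
```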

Here is a short, more complete example of how to do this (and already thread-safe; the loop is meant to run on a background queue):

AVMutableComposition *_composition = [AVMutableComposition composition];

// Iterate through all files
// and build the mutable composition
for (int i = 0; i < filesCount; i++) {

    AVURLAsset *sourceAsset = nil;

    NSURL *movieURL = [NSURL fileURLWithPath:[paths objectAtIndex:i]];
    sourceAsset = [AVURLAsset URLAssetWithURL:movieURL options:nil];

    // calculate the time range
    CMTimeRange editRange = CMTimeRangeMake(CMTimeMake(0, 600), sourceAsset.duration);

    NSError *editError = nil;
    BOOL result = [_composition insertTimeRange:editRange
                                        ofAsset:sourceAsset
                                         atTime:_composition.duration
                                          error:&editError];
    if (!result) {
        NSLog(@"insert error: %@", editError);
    }

    dispatch_sync(dispatch_get_main_queue(), ^{

        // maybe you need a progress bar
        self.loaderBar.progress = (float) i / filesCount;
        [self.loaderBar setNeedsDisplay];
     });

}

// make the composition threadsafe if you need it later
self.composition = [[_composition copy] autorelease];

// The player wants the main thread
dispatch_sync(dispatch_get_main_queue(), ^{

    mp = [AVPlayer playerWithPlayerItem:[[[AVPlayerItem alloc] initWithAsset:self.composition] autorelease]];

    self.observer = [mp addPeriodicTimeObserverForInterval:CMTimeMake(60, 600) queue:NULL usingBlock:^(CMTime time) {

        // this is our callback block to update the progress bar
        if (mp.status == AVPlayerStatusReadyToPlay) {

            // CMTimeGetSeconds avoids the integer division
            // of time.value / time.timescale
            Float64 actualTime = CMTimeGetSeconds(time);
            Float64 lengthTime = CMTimeGetSeconds(mp.currentItem.asset.duration);

            // avoid division by zero
            if (lengthTime > 0.) {

                self.progressBar.value = actualTime / lengthTime;
            } else {

                self.progressBar.value = 0.0f;
            }
        }
    }];

    // create our playerLayer (we are already on the main thread)
    self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:mp];
    self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    self.playerLayer.frame = [self view].layer.bounds;

    // insert it into our view (make it visible)
    [[self view].layer insertSublayer:self.playerLayer atIndex:0];

    // and now start playback; maybe mp is global (self.mp),
    // depending on your needs
    [mp play];
});

I hope this helps.