22
votes

Until the iOS 7 update I was using...

UIImage *image = [moviePlayer thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];

...with great success, so that my app could show a still of the video that the user had just taken.

I understand this method has been deprecated as of iOS 7, so I need an alternative. I see there's a method,

- (void)requestThumbnailImagesAtTimes:(NSArray *)playbackTimes timeOption:(MPMovieTimeOption)option

but how do I get the image back from it so I can set it as the videoReview button's image?

Thanks in advance, Jim.

**Edited question, after trying the notification centre method**

I used the following code:

[moviePlayer requestThumbnailImagesAtTimes:times timeOption:MPMovieTimeOptionNearestKeyFrame];

[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification::) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];

I made the NSArray times out of two NSNumber objects, 1 and 2.

I then tried to capture the notification in the following method:

-(void)MPMoviePlayerThumbnailImageRequestDidFinishNotification: (NSDictionary*)info{

UIImage *image = [info objectForKey:MPMoviePlayerThumbnailImageKey];

I then proceeded to use this thumbnail image as the button's preview image... but it didn't work.

If you can see from my code where I've gone wrong, your help would be appreciated again. Cheers

7

7 Answers

42
votes

I managed to find a great way using AVAssetImageGenerator; please see the code below...

AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:partOneUrl options:nil];
AVAssetImageGenerator *generate1 = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
generate1.appliesPreferredTrackTransform = YES;
NSError *err = NULL;
CMTime time = CMTimeMake(1, 2); // 0.5 seconds into the video
CGImageRef oneRef = [generate1 copyCGImageAtTime:time actualTime:NULL error:&err];
UIImage *one = [[UIImage alloc] initWithCGImage:oneRef];
CGImageRelease(oneRef); // copyCGImageAtTime: returns a +1 reference, so release it
[_firstImage setImage:one];
_firstImage.contentMode = UIViewContentModeScaleAspectFit;

In the header file, import:

#import <AVFoundation/AVFoundation.h>

It works perfectly, and I've been able to call it from viewDidLoad, which was quicker than calling the deprecated thumbnailImageAtTime: from viewDidAppear.

Hope this helps anyone else who had the same problem.
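As a side note, copyCGImageAtTime: is synchronous, so for a long or remote video it can stall the thread it runs on. A sketch of a non-blocking alternative, assuming the same asset, URL, and image-view names as above, is AVAssetImageGenerator's asynchronous API:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch only: partOneUrl and _firstImage are assumed from the snippet above.
AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:partOneUrl options:nil];
AVAssetImageGenerator *generate1 = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
generate1.appliesPreferredTrackTransform = YES;

NSValue *time = [NSValue valueWithCMTime:CMTimeMake(1, 2)]; // 0.5 s into the video
[generate1 generateCGImagesAsynchronouslyForTimes:@[time]
                                completionHandler:^(CMTime requestedTime,
                                                    CGImageRef imageRef,
                                                    CMTime actualTime,
                                                    AVAssetImageGeneratorResult result,
                                                    NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded) {
        UIImage *thumb = [[UIImage alloc] initWithCGImage:imageRef];
        // The handler runs on a background queue; touch UIKit on the main thread.
        dispatch_async(dispatch_get_main_queue(), ^{
            [_firstImage setImage:thumb];
        });
    }
}];
```

The handler is called once per requested time, so you can pass several NSValue-wrapped CMTimes in one call if you need a strip of thumbnails.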

**Update for Swift 5.1**

Useful function...

    func createThumbnailOfVideoUrl(url: URL) -> UIImage? {
        let asset = AVAsset(url: url)
        let assetImgGenerate = AVAssetImageGenerator(asset: asset)
        assetImgGenerate.appliesPreferredTrackTransform = true

        let time = CMTimeMakeWithSeconds(1.0, preferredTimescale: 600)
        do {
            let img = try assetImgGenerate.copyCGImage(at: time, actualTime: nil) 
            let thumbnail = UIImage(cgImage: img)
            return thumbnail
        } catch {
          print(error.localizedDescription)
          return nil
        }
    }
5
votes

The requestThumbnailImagesAtTimes:timeOption: method will post a MPMoviePlayerThumbnailImageRequestDidFinishNotification notification when an image request completes. Your code that needs the thumbnail image should subscribe to this notification using NSNotificationCenter, and use the image when it receives the notification.
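A minimal sketch of that flow, using the names from the question (the selector name thumbnailRequestDidFinish: is my own choice, not an API requirement):

```objc
// Subscribe before issuing the request; the selector receives an NSNotification.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(thumbnailRequestDidFinish:)
                                             name:MPMoviePlayerThumbnailImageRequestDidFinishNotification
                                           object:moviePlayer];
[moviePlayer requestThumbnailImagesAtTimes:@[@1.f, @2.f]
                                timeOption:MPMovieTimeOptionNearestKeyFrame];

// One notification arrives per requested time; the image is in userInfo.
- (void)thumbnailRequestDidFinish:(NSNotification *)note {
    UIImage *image = note.userInfo[MPMoviePlayerThumbnailImageKey];
    if (image) {
        [self.videoReview setImage:image forState:UIControlStateNormal];
    }
}
```

Remember to remove the observer (for example in dealloc) so the notification isn't delivered to a deallocated controller.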

2
votes

The problem is that you have to specify float values in requestThumbnailImagesAtTimes.

For example, this will work

[self.moviePlayer requestThumbnailImagesAtTimes:@[@14.f] timeOption:MPMovieTimeOptionNearestKeyFrame];

but this won't work:

[self.moviePlayer requestThumbnailImagesAtTimes:@[@14] timeOption:MPMovieTimeOptionNearestKeyFrame];
2
votes

The way to do it, at least in iOS 7, is to use floats for your times. Note also that requestThumbnailImagesAtTimes: takes an array, not a single NSNumber:

NSNumber *timeStamp = @1.f;
[moviePlayer requestThumbnailImagesAtTimes:@[timeStamp] timeOption:MPMovieTimeOptionNearestKeyFrame];

Hope this helps

2
votes

Jeely provides a good workaround, but it requires an additional framework that isn't necessary when MPMoviePlayer already provides functions for this task. I noticed a syntax error in the original poster's code: the thumbnail notification handler receives an object of type NSNotification, not a dictionary. Here's a corrected example:

- (void)MPMoviePlayerThumbnailImageRequestDidFinishNotification:(NSNotification *)note
{
    NSDictionary *userInfo = [note userInfo];
    UIImage *image = (UIImage *)[userInfo objectForKey:MPMoviePlayerThumbnailImageKey];
    if (image != nil)
        [thumbView setImage:image];
}
0
votes

I've just looked for a solution to this problem myself and got good help from your question. I got your code above to work with one small change: removing a colon...

Change

[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification::) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];

to

[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification:) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];

Got this to work nicely. Also, I've found that you can't call a method that relies on NSNotificationCenter if you're already inside a notification selector. It's something I tried at first: calling requestThumbnailImagesAtTimes: inside the notification selector for MPMoviePlayerPlaybackDidFinishNotification, which won't work, I think because the notification won't fire.

0
votes

The code in Swift 2.1 would look like this:

do{
    let asset1 =  AVURLAsset(URL: url)
    let generate1: AVAssetImageGenerator = AVAssetImageGenerator(asset: asset1)
    generate1.appliesPreferredTrackTransform = true

    let time: CMTime = CMTimeMake(3, 1)  // capture the third second of the video
    let oneRef: CGImageRef = try generate1.copyCGImageAtTime(time, actualTime: nil)
    let resultImage = UIImage(CGImage: oneRef)
}
catch let error as NSError{
    print(error)
}