Grabbing the first frame of a video from UIImagePickerController?

没有蜡笔的小新 · 2021-02-01 08:47

I'm trying to show the first frame of a video selected in a UIImagePickerController in a UIImageView, but I don't know whether it's possible.

1 Answer
  • 2021-02-01 09:40

    You can do this in one of two ways. The first is to use MPMoviePlayerController to grab the thumbnail:

    // Note: thumbnailImageAtTime:timeOption: is synchronous and has been
    // deprecated since iOS 7; `time` is the offset in seconds to capture.
    MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc]
                                           initWithContentURL:videoURL];
    moviePlayer.shouldAutoplay = NO;
    UIImage *thumbnail = [moviePlayer thumbnailImageAtTime:time
                         timeOption:MPMovieTimeOptionNearestKeyFrame];
    

    This works, but MPMoviePlayerController is not a particularly lightweight object, and it is not particularly fast at grabbing thumbnails.

    The preferred way is to use AVAssetImageGenerator from AVFoundation. It is fast, lightweight, and more flexible than the old approach. Here's a helper method that returns a thumbnail image from the video.

    
    + (UIImage *)thumbnailImageForVideo:(NSURL *)videoURL
                                 atTime:(NSTimeInterval)time
    {
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
        NSParameterAssert(asset);
        AVAssetImageGenerator *assetIG =
                    [[AVAssetImageGenerator alloc] initWithAsset:asset];
        assetIG.appliesPreferredTrackTransform = YES;
        assetIG.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;

        NSError *igError = nil;
        // CMTimeMakeWithSeconds preserves the fractional part of `time`,
        // which CMTimeMake((int64_t)time, 60) would silently truncate.
        CGImageRef thumbnailImageRef =
                 [assetIG copyCGImageAtTime:CMTimeMakeWithSeconds(time, 600)
                                 actualTime:NULL
                                      error:&igError];

        if (!thumbnailImageRef) {
            NSLog(@"thumbnailImageGenerationError %@", igError);
            return nil;
        }

        UIImage *thumbnailImage = [[UIImage alloc] initWithCGImage:thumbnailImageRef];
        // copyCGImageAtTime: returns a +1 CGImageRef; ARC does not manage
        // Core Foundation objects, so release it explicitly to avoid a leak.
        CGImageRelease(thumbnailImageRef);

        return thumbnailImage;
    }
    

    Asynchronous usage

    
    - (void)thumbnailImageForVideo:(NSURL *)videoURL
                            atTime:(NSTimeInterval)time
                        completion:(void (^)(UIImage *thumbnail))completion
    {
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
            AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
            NSParameterAssert(asset);
            AVAssetImageGenerator *assetIG =
                        [[AVAssetImageGenerator alloc] initWithAsset:asset];
            assetIG.appliesPreferredTrackTransform = YES;
            assetIG.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;

            NSError *igError = nil;
            // CMTimeMakeWithSeconds avoids truncating fractional seconds.
            CGImageRef thumbnailImageRef =
                     [assetIG copyCGImageAtTime:CMTimeMakeWithSeconds(time, 600)
                                     actualTime:NULL
                                          error:&igError];

            if (!thumbnailImageRef)
                NSLog(@"thumbnailImageGenerationError %@", igError);

            UIImage *thumbnailImage = thumbnailImageRef
                ? [[UIImage alloc] initWithCGImage:thumbnailImageRef]
                : nil;
            // Release the +1 CGImageRef; CGImageRelease is NULL-safe.
            CGImageRelease(thumbnailImageRef);

            // Deliver the result on the main queue for UI work.
            dispatch_async(dispatch_get_main_queue(), ^{
                if (completion)
                    completion(thumbnailImage);
            });
        });
    }
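
    To answer the original question directly, you can pull the video URL out of the picker's info dictionary in the delegate callback and hand it to a helper like the one above. A minimal sketch, assuming the helper lives on a hypothetical `VideoUtils` class, `self.imageView` is your UIImageView, and the picker was configured with `mediaTypes = @[(NSString *)kUTTypeMovie]`:

    ```objective-c
    #import <MobileCoreServices/MobileCoreServices.h>

    - (void)imagePickerController:(UIImagePickerController *)picker
    didFinishPickingMediaWithInfo:(NSDictionary<NSString *, id> *)info
    {
        // UIImagePickerControllerMediaURL holds the file URL of the picked video.
        NSURL *videoURL = info[UIImagePickerControllerMediaURL];
        [picker dismissViewControllerAnimated:YES completion:nil];

        if (videoURL) {
            // Grab the frame at t = 0 (the first frame) and show it.
            self.imageView.image = [VideoUtils thumbnailImageForVideo:videoURL
                                                               atTime:0.0];
        }
    }
    ```

    For a smoother UI, call the asynchronous variant instead and assign the image in its completion block, since the synchronous helper blocks whatever thread it runs on.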
    