iPhone: Read UIImage (frames) from video with AVFoundation

耶瑟儿~ 2020-11-29 18:14

Sorry for my English :) Looking for information about reading frames from a video on the iPhone, I found this project: http://www.codza.com/extracting-frames-from-movies-on-iphone/c

5 Answers
  • 2020-11-29 18:17

    You can also try AVAssetImageGenerator, specifically generateCGImagesAsynchronouslyForTimes:completionHandler:.

    This SO answer has good example code.
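
    For reference, a rough Objective-C sketch of that call; the method name generateFramesAsynchronously, the bundled file @"myMovie.mp4", and the requested times are placeholders, not something from the linked answer:

    #import <AVFoundation/AVFoundation.h>

    - (void)generateFramesAsynchronously {
        // "myMovie.mp4" stands in for a movie bundled with the app.
        NSURL *url = [[NSBundle mainBundle] URLForResource:@"myMovie" withExtension:@"mp4"];
        AVAssetImageGenerator *generator =
            [[AVAssetImageGenerator alloc] initWithAsset:[AVAsset assetWithURL:url]];
        generator.appliesPreferredTrackTransform = YES;

        // Ask for frames at 1 s, 2 s and 3 s; the times are wrapped in NSValue for this API.
        NSArray *times = @[[NSValue valueWithCMTime:CMTimeMake(1, 1)],
                           [NSValue valueWithCMTime:CMTimeMake(2, 1)],
                           [NSValue valueWithCMTime:CMTimeMake(3, 1)]];

        [generator generateCGImagesAsynchronouslyForTimes:times
                                        completionHandler:^(CMTime requestedTime, CGImageRef image,
                                                            CMTime actualTime, AVAssetImageGeneratorResult result,
                                                            NSError *error) {
            if (result == AVAssetImageGeneratorSucceeded && image != NULL) {
                // The handler runs on a background queue; hop to the main queue before touching any UI.
                UIImage *frame = [UIImage imageWithCGImage:image];
                NSLog(@"Got frame at %.2f s: %@", CMTimeGetSeconds(actualTime), frame);
            } else {
                NSLog(@"Frame at %.2f s failed: %@", CMTimeGetSeconds(requestedTime), error);
            }
        }];
    }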

  • 2020-11-29 18:18

    You're talking about using the calls for generating what Apple calls thumbnail images from videos at specific times.

    For an MPMoviePlayerController (the class iOS uses to hold a video from a file or another source), there are two methods for this. The first generates a single thumbnail (image) from a movie at a specific point in time, and the second generates a set of thumbnails for an array of times.

    This example gets an image at 10 seconds into a movie clip, myMovie.mp4:

    // Build a file URL for the bundled movie; a bare file-name string is not a valid URL for a local resource.
    NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"myMovie" withExtension:@"mp4"];
    MPMoviePlayerController *movie = [[MPMoviePlayerController alloc]
            initWithContentURL:movieURL];
    UIImage *singleFrameImage = [movie thumbnailImageAtTime:10
            timeOption:MPMovieTimeOptionExact];
    

    Note that this performs synchronously - i.e. the user will be forced to wait while you get the screenshot.

    The other option is to get a series of images from a movie, from an array of times:

    MPMoviePlayerController *movie = [[MPMoviePlayerController alloc]
            initWithContentURL:[[NSBundle mainBundle] URLForResource:@"myMovie" withExtension:@"mp4"]];
    NSNumber *time1 = @10;
    NSNumber *time2 = @11;
    NSNumber *time3 = @12;
    NSArray *times = @[time1, time2, time3];
    [movie requestThumbnailImagesAtTimes:times timeOption:MPMovieTimeOptionExact];
    

    This second way will trigger a notification of type MPMoviePlayerThumbnailImageRequestDidFinishNotification each time a new image is generated. You can set up an observer to monitor this and process each image as it arrives; a rough sketch follows.
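
    A minimal sketch of such an observer (the method names observeThumbnailNotificationsForPlayer: and thumbnailRequestDidFinish: are just placeholders; the userInfo keys are MediaPlayer's documented constants):

    // Call this before requestThumbnailImagesAtTimes:timeOption: so no notifications are missed.
    - (void)observeThumbnailNotificationsForPlayer:(MPMoviePlayerController *)movie {
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(thumbnailRequestDidFinish:)
                                                     name:MPMoviePlayerThumbnailImageRequestDidFinishNotification
                                                   object:movie];
    }

    // Fired once per generated thumbnail.
    - (void)thumbnailRequestDidFinish:(NSNotification *)notification {
        UIImage *thumbnail = notification.userInfo[MPMoviePlayerThumbnailImageKey];
        if (thumbnail) {
            // Use the image here, e.g. assign it to an image view.
        } else {
            NSError *error = notification.userInfo[MPMoviePlayerThumbnailErrorKey];
            NSLog(@"Thumbnail request failed: %@", error);
        }
    }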

  • 2020-11-29 18:21

    In Swift 4 this worked for me with some modifications, mainly changing the "at" parameter of imageGenerator.copyCGImage to a CMTime type:

    import AVFoundation
    import UIKit

    func showFrame(from file: String) {
        let file = file.components(separatedBy: ".")
        guard let path = Bundle.main.path(forResource: file[0], ofType:file[1]) else {
            debugPrint( "\(file.joined(separator: ".")) not found")
            return
        }
        let url = URL(fileURLWithPath: path)
        let image = previewImageForLocalVideo(url: url)
        let imgView = UIImageView(image: image)
        view.addSubview(imgView)
    }    
    
    func previewImageForLocalVideo(url:URL) -> UIImage? {
        let asset = AVAsset(url: url)
        let imageGenerator = AVAssetImageGenerator(asset: asset)
        imageGenerator.appliesPreferredTrackTransform = true
        let tVal = CMTime(value: 12, timescale: 1) // 12 seconds into the clip, at a timescale of 1
        do {
            let imageRef = try imageGenerator.copyCGImage(at: tVal, actualTime: nil)
            return UIImage(cgImage: imageRef)
        }
        catch let error as NSError
        {
            print("Image generation failed with error \(error)")
            return nil
        }
    }
    
    override func viewDidLoad() {
        super.viewDidLoad()
        showFrame(from:"video.mp4")
    }
    

    Source

  • 2020-11-29 18:43

    Swift 2 code to grab a frame with AVAssetImageGenerator:

    import AVFoundation
    import UIKit

    func previewImageForLocalVideo(url: NSURL) -> UIImage?
    {
        let asset = AVAsset(URL: url)
        let imageGenerator = AVAssetImageGenerator(asset: asset)
        imageGenerator.appliesPreferredTrackTransform = true
    
        var time = asset.duration
        //If possible, avoid the very first frame (it can be completely black or white in camera footage)
        time.value = min(time.value, 2)
    
        do {
            let imageRef = try imageGenerator.copyCGImageAtTime(time, actualTime: nil)
            return UIImage(CGImage: imageRef)
        }
        catch let error as NSError
        {
            print("Image generation failed with error \(error)")
            return nil
        }
    }
    
  • 2020-11-29 18:44

    Here is code to extract images from a video at a given frame rate (FPS):

    1) Import

    #import <AVFoundation/AVFoundation.h>
    #import <Photos/Photos.h>
    

    2) In viewDidLoad

        // videoUrl is assumed to be an NSURL property/instance variable on this controller.
        videoUrl = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"VfE_html5" ofType:@"mp4"]];
        [self createImage:5]; // 5 frames per second (FPS); change as needed
    

    3) Functions

    -(void)createImage:(int)withFPS {
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
        AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
        generator.requestedTimeToleranceAfter =  kCMTimeZero;
        generator.requestedTimeToleranceBefore =  kCMTimeZero;
    
        for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) * withFPS; i++) {
            @autoreleasepool {
                CMTime time = CMTimeMake((int64_t)i, withFPS);
                NSError *err;
                CMTime actualTime;
                CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
                if (image == NULL) {
                    NSLog(@"Failed to generate frame %lld: %@", (int64_t)i, err);
                    continue;
                }
                UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];
                [self savePhotoToAlbum:generatedImage]; // Saves the image to the photo library rather than keeping it in memory
                CGImageRelease(image);
            }
        }
    }
    
    -(void)savePhotoToAlbum:(UIImage*)imageToSave {
        [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
            // Request creation of a Photos asset from the generated frame.
            [PHAssetChangeRequest creationRequestForAssetFromImage:imageToSave];
        } completionHandler:^(BOOL success, NSError *error) {
            if (success) {
                NSLog(@"Saved frame to the photo library.");
            }
            else {
                NSLog(@"Failed to save frame: %@", error);
            }
        }];
    }
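
    One caveat not shown above: writing to the photo library only succeeds once the user has granted access (and the app's Info.plist contains a photo library usage description such as NSPhotoLibraryUsageDescription). A rough sketch of requesting that access before kicking off createImage: might look like this:

    // Placement in viewDidLoad is only an example.
    [PHPhotoLibrary requestAuthorization:^(PHAuthorizationStatus status) {
        if (status == PHAuthorizationStatusAuthorized) {
            // The handler may arrive on a background queue; hop back to the main queue before driving the extraction.
            dispatch_async(dispatch_get_main_queue(), ^{
                [self createImage:5];
            });
        } else {
            NSLog(@"Photo library access was not granted; frames cannot be saved.");
        }
    }];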
    