Screenshot for AVPlayer and Video

抹茶落季 2020-11-30 00:13

I am trying to take a screenshot of an AVPlayer inside a bigger view. I want to build a testing framework only, so private APIs or any other approach is fine.

4 Answers
  • 2020-11-30 00:54

AVPlayer renders video using the GPU, so you cannot capture it with Core Graphics methods.

However, it is possible to capture frames with AVAssetImageGenerator; you need to specify a CMTime.


    Update:

    Forget taking a screenshot of the entire screen. AVPlayerItemVideoOutput is my final choice; it supports video streams.

    Here is my full implementation: https://github.com/BB9z/ZFPlayer/commit/a32c7244f630e69643336b65351463e00e712c7f#diff-2d23591c151edd3536066df7c18e59deR448
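    The linked commit has the full details; the core AVPlayerItemVideoOutput flow looks roughly like the sketch below. This is a minimal illustration, not the exact code from the commit: it assumes you have a playing `AVPlayer` named `player`, and the variable names are made up for the example.

    ```objectivec
    #import <AVFoundation/AVFoundation.h>
    #import <UIKit/UIKit.h>

    // 1. Attach the output once, e.g. when the item becomes ready to play.
    AVPlayerItemVideoOutput *videoOutput = [[AVPlayerItemVideoOutput alloc]
        initWithPixelBufferAttributes:@{(id)kCVPixelBufferPixelFormatTypeKey:
                                            @(kCVPixelFormatType_32BGRA)}];
    [player.currentItem addOutput:videoOutput];

    // 2. Later, pull the latest frame for the item's current time.
    CMTime time = player.currentItem.currentTime;
    if ([videoOutput hasNewPixelBufferForItemTime:time]) {
        CVPixelBufferRef buffer =
            [videoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:NULL];
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
        CIContext *context = [CIContext contextWithOptions:nil];
        CGImageRef cgImage = [context createCGImage:ciImage
                                           fromRect:ciImage.extent];
        // `frame` now holds the current video frame as a UIImage.
        UIImage *frame = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
        CVBufferRelease(buffer);
    }
    ```

    Unlike `renderInContext:`, this reads the decoded pixel buffers directly, so it works even though the video is composited on the GPU.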

  • 2020-11-30 01:01

    If you want to take a screenshot of the current screen, just call the following method from any action event; it returns a UIImage object.

    -(UIImage *)screenshot
    {
        // Render the whole view hierarchy into an image context.
        // A scale of 0 uses the screen's scale, so the capture is Retina-aware.
        UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0);
        [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage *sourceImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
    
        // Redraw the image; pass a negative X/Y offset to drawAtPoint:
        // to crop away everything above/left of the portion you want.
        UIGraphicsBeginImageContextWithOptions(sourceImage.size, NO, 0);
        [sourceImage drawAtPoint:CGPointMake(0, 0)];
        UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
    
        // To save the image to the device's photo album:
        //UIImageWriteToSavedPhotosAlbum(croppedImage, nil, nil, nil);
    
        return croppedImage;
    }
    
    

    Hope this will help you.

  • 2020-11-30 01:07
    // The rect you want to capture.
    CGRect grabRect = CGRectMake(0, 0, 320, 568);
    
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
        UIGraphicsBeginImageContextWithOptions(grabRect.size, NO, [UIScreen mainScreen].scale);
    } else {
        UIGraphicsBeginImageContext(grabRect.size);
    }
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    // Shift the context so only grabRect lands inside the image.
    CGContextTranslateCTM(ctx, -grabRect.origin.x, -grabRect.origin.y);
    [self.view.layer renderInContext:ctx];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    
    
  • 2020-11-30 01:08

    I know you don't want to use AVAssetImageGenerator, but I've also researched this extensively and I believe it is currently the only solution. It's not as difficult as you say to get the right frame, because you can query the player's current time. In my app the following code works perfectly:

    -(UIImage *)getAVPlayerScreenshot 
    {
        AVURLAsset *asset = (AVURLAsset *)self.playerItem.asset;
        AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
        imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;
        imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;
        CGImageRef thumb = [imageGenerator copyCGImageAtTime:self.playerItem.currentTime
                                                  actualTime:NULL
                                                       error:NULL];
        UIImage *videoImage = [UIImage imageWithCGImage:thumb];
        CGImageRelease(thumb);
        return videoImage;
    }
    