Question
I am new to mobile programming. I am working on H.264 video rendering in an iOS application using the VideoToolbox framework, and I need a feature that takes a snapshot while the video is rendering. Whenever I take a snapshot, I get only a black screen. I have tried 1. renderInContext:, 2. drawViewHierarchyInRect:afterScreenUpdates:, and 3. snapshotViewAfterScreenUpdates: to capture the rendered video, but each of them returns only a black screen.
// Snapshot code
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, YES, 0.0);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshotImage = UIGraphicsGetImageFromCurrentImageContext();
mImageView.image = snapshotImage;
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(snapshotImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
Answer 1:
Check this out; the following chunk of code works for me to take a snapshot of the screen:
// Render the whole key window into an image context at the screen's scale.
// APP_DELEGATE is assumed to be a macro that resolves to the application delegate.
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
    UIGraphicsBeginImageContextWithOptions(APP_DELEGATE.window.bounds.size, NO, [[UIScreen mainScreen] scale]);
else
    UIGraphicsBeginImageContext(APP_DELEGATE.window.bounds.size);
[APP_DELEGATE.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *data = UIImagePNGRepresentation(image);
I guess it will help you; let me know if it does.
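For completeness, the PNG data from the snippet above could then be written to disk; a minimal sketch, assuming you want it in the app's Documents directory (the file name is illustrative):

// Write the PNG data to Documents/snapshot.png (hypothetical file name).
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *path = [documentsDir stringByAppendingPathComponent:@"snapshot.png"];
[data writeToFile:path atomically:YES];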
Answer 2:
I haven't worked with video yet, but a simple snapshot of a UIView with subviews on it works fine for me:
// Renders the given view into an image, then crops it to the region covered
// by the image view at the given offset.
+ (UIImage *)makeSnapShot:(UIView *)view image:(UIImageView *)imageView
{
    CGFloat offset_x = /*your_value*/;
    CGFloat offset_y = /*your_value*/;

    UIGraphicsBeginImageContext(view.bounds.size);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Crop the rendered image to the image view's size at the chosen offset.
    CGRect rect = CGRectMake(offset_x, offset_y, imageView.bounds.size.width, imageView.bounds.size.height);
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], rect);
    image = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);

    return image;
}
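A hedged usage sketch, assuming the method above lives in a class called SnapshotHelper and that mImageView is the image view from the question (both names are illustrative):

UIImage *cropped = [SnapshotHelper makeSnapShot:self.view image:mImageView];
mImageView.image = cropped;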
Answer 3:
I'm not sure whether this is what you're looking for, but if you need a snapshot of the frames coming out of the VTDecompressionSession, you can pass the CVImageBuffer you get in the decode-frame callback into this method to get a UIImage. You can also pass your own CIContext in as a parameter instead of using the temporaryContext.
+ (UIImage *)UIImageFromCVImageBufferRef:(CVImageBufferRef)imageBuf
{
    // Wrap the pixel buffer in a CIImage, then render it to a CGImage via a CIContext.
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuf];
    CIContext *temporaryContext = [CIContext contextWithOptions:nil];
    CGImageRef videoImage = [temporaryContext createCGImage:ciImage
                                                   fromRect:CGRectMake(0, 0,
                                                                       CVPixelBufferGetWidth(imageBuf),
                                                                       CVPixelBufferGetHeight(imageBuf))];
    UIImage *image = [[UIImage alloc] initWithCGImage:videoImage];
    CGImageRelease(videoImage);
    return image;
}
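As a hedged sketch of how this could be wired up (the callback name didDecompress and the class name VideoSnapshotHelper below are illustrative, not from the original answer): the VTDecompressionSession output callback receives the decoded CVImageBufferRef, which can be converted with the method above and handed back to the UI on the main queue.

#import <UIKit/UIKit.h>
#import <VideoToolbox/VideoToolbox.h>

// Illustrative VTDecompressionOutputCallback; it would be registered in the
// VTDecompressionOutputCallbackRecord when creating the session.
static void didDecompress(void *decompressionOutputRefCon,
                          void *sourceFrameRefCon,
                          OSStatus status,
                          VTDecodeInfoFlags infoFlags,
                          CVImageBufferRef imageBuffer,
                          CMTime presentationTimeStamp,
                          CMTime presentationDuration)
{
    if (status != noErr || imageBuffer == NULL) {
        return; // decode failed or the frame was dropped
    }
    // Convert the decoded frame to a UIImage using the method from the answer above
    // (VideoSnapshotHelper is a hypothetical class holding that method).
    UIImage *snapshot = [VideoSnapshotHelper UIImageFromCVImageBufferRef:imageBuffer];
    dispatch_async(dispatch_get_main_queue(), ^{
        // Hand the snapshot to the UI or, for example, save it to the photo album.
        UIImageWriteToSavedPhotosAlbum(snapshot, nil, NULL, NULL);
    });
}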
Answer 4:
func takeScreenshot(_ shouldSave: Bool = true) {
    var screenshotImage: UIImage?
    let layer = UIApplication.shared.keyWindow!.layer
    let scale = UIScreen.main.scale
    // Draw the view hierarchy (not just the layer) so the snapshot is taken
    // after pending screen updates have been applied.
    UIGraphicsBeginImageContextWithOptions(layer.frame.size, false, scale)
    self.view.drawHierarchy(in: self.view.bounds, afterScreenUpdates: true)
    screenshotImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    if let image = screenshotImage, shouldSave {
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
    }
}
Source: https://stackoverflow.com/questions/27646689/snap-shot-is-not-working