Getting only white screenshot

Submitted by 北城以北 on 2019-12-02 11:39:10

Question


I can read the barcode, but I can't get a snapshot of the screen: the getScreenImage function returns a white image. How can I take a screenshot that includes what I see in the camera preview? Thank you.

@interface igViewController () <AVCaptureMetadataOutputObjectsDelegate,AVCaptureVideoDataOutputSampleBufferDelegate>
{
    AVCaptureSession *_session;
    AVCaptureDevice *_device;
    AVCaptureDeviceInput *_input;
    AVCaptureMetadataOutput *_output;
    AVCaptureVideoPreviewLayer *_prevLayer;

    UIView *_highlightView;
    UILabel *_label;
    UIImage *img;
    NSString *detectionString;                      // decoded barcode text
    AVCaptureStillImageOutput *_stillImageOutput;   // still-image output used in the edited code below

}
@end
- (void)viewDidLoad
{
    [super viewDidLoad];

    _highlightView = [[UIView alloc] init];
    _highlightView.autoresizingMask = UIViewAutoresizingFlexibleTopMargin|UIViewAutoresizingFlexibleLeftMargin|UIViewAutoresizingFlexibleRightMargin|UIViewAutoresizingFlexibleBottomMargin;
    _highlightView.layer.borderColor = [UIColor greenColor].CGColor;
    _highlightView.layer.borderWidth = 3;
    [self.view addSubview:_highlightView];

    _label = [[UILabel alloc] init];
    _label.frame = CGRectMake(0, self.view.bounds.size.height -100, self.view.bounds.size.width, 100);
    _label.autoresizingMask = UIViewAutoresizingFlexibleTopMargin;
    _label.backgroundColor = [UIColor colorWithWhite:0.15 alpha:0.65];
    _label.textColor = [UIColor whiteColor];
    _label.textAlignment = NSTextAlignmentCenter;


    //[_label addObserver:self forKeyPath:@"text" options:NSKeyValueObservingOptionNew context:NULL];
    [self.view addSubview:_label];

    _session = [[AVCaptureSession alloc] init];
    _device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;

    _input = [AVCaptureDeviceInput deviceInputWithDevice:_device error:&error];
    if (_input) {
        [_session addInput:_input];
    } else {
        NSLog(@"Error: %@", error);
    }

    _output = [[AVCaptureMetadataOutput alloc] init];
    [_output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [_session addOutput:_output];

    _output.metadataObjectTypes = [_output availableMetadataObjectTypes];

    _prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
    _prevLayer.frame = self.view.bounds;
    _prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:_prevLayer];
    [_session startRunning];

    [self.view bringSubviewToFront:_highlightView];
    [self.view bringSubviewToFront:_label];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    CGRect highlightViewRect = CGRectZero;
    AVMetadataMachineReadableCodeObject *barCodeObject;
    detectionString = nil;
    NSArray *barCodeTypes = @[AVMetadataObjectTypeUPCECode, AVMetadataObjectTypeCode39Code, AVMetadataObjectTypeCode39Mod43Code,
            AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeEAN8Code, AVMetadataObjectTypeCode93Code, AVMetadataObjectTypeCode128Code,
            AVMetadataObjectTypePDF417Code, AVMetadataObjectTypeQRCode, AVMetadataObjectTypeAztecCode];

    for (AVMetadataObject *metadata in metadataObjects) {

        for (NSString *type in barCodeTypes) {
            if ([metadata.type isEqualToString:type])
            {
                barCodeObject = (AVMetadataMachineReadableCodeObject *)[_prevLayer transformedMetadataObjectForMetadataObject:(AVMetadataMachineReadableCodeObject *)metadata];
                highlightViewRect = barCodeObject.bounds;
                detectionString = [(AVMetadataMachineReadableCodeObject *)metadata stringValue];
                break;
            }
        }

        _highlightView.frame = highlightViewRect;
        if (detectionString != nil)
        {
            NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil]; // unused in this version
            [self getScreenImage];
            _label.text = detectionString;
            [self performSelector:@selector(changeText) withObject:nil afterDelay:2.0];
            break;
        }
    }

    if(detectionString!=nil){
        [MainViewController setResultTexts:detectionString img:img];
        detectionString=nil;
        [self dismissViewControllerAnimated:YES completion:nil];
        [_session stopRunning];

    }
}
// UIGetScreenImage is a private API with no public header (and is not App Store safe), so it is forward-declared here.
CGImageRef UIGetScreenImage(void);
-(void)getScreenImage{
    if (detectionString != nil) {
        CGImageRef screen = UIGetScreenImage();
        img = [UIImage imageWithCGImage:screen];
        CGImageRelease(screen);
    }
}

Edited:

This works, but the output change on the session needs to happen more quickly. Because the session switches outputs to capture the image, the preview disappears for a second, and the screenshot I get looks like it has only 50% opacity. This link is where I got help. How can I get a screenshot at 100% opacity now? (A sketch of one way to avoid the output swap entirely follows the captureNow code below.)

_highlightView.frame = highlightViewRect;
        if (detectionString != nil)
        {
            // new: swap the metadata output for the still-image output
            [_session removeOutput:_output];
            [_session addOutput:_stillImageOutput];
            _label.text = detectionString;
            // new: trigger the still-image capture
            [self captureNow];
            [self performSelector:@selector(changeText) withObject:nil afterDelay:1.0];
            break;
        }

-(void)captureNow {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in _stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo] )
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection)
        {
            break;
        }
    }

    NSLog(@"about to request a capture from: %@", _stillImageOutput);
    [_stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
     {
         NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
         img = [[UIImage alloc] initWithData:imageData];
     }];

}
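
(Not from the original post, just a sketch.) One way to avoid the preview dropping out is to skip the output swap altogether: an AVCaptureSession can drive several outputs at once, so the still-image output can be created and added next to the metadata output during setup (e.g. in viewDidLoad), assuming the same _session and _stillImageOutput ivars as above:

    // Sketch: create the still-image output once and add it alongside the metadata output,
    // so nothing has to be removed or re-added (and the preview is never interrupted)
    // when a barcode is detected.
    _stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    _stillImageOutput.outputSettings = @{ AVVideoCodecKey : AVVideoCodecJPEG };
    if ([_session canAddOutput:_stillImageOutput]) {
        [_session addOutput:_stillImageOutput];
    }

With that in place, the delegate only needs to call captureNow when a code is detected; the removeOutput:/addOutput: lines can be dropped.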

Answer 1:


I used AVCaptureStillImageOutput and it worked.

    -(void)captureNow {
        AVCaptureConnection *videoConnection = nil;
        for (AVCaptureConnection *connection in _stillImageOutput.connections)
        {
            for (AVCaptureInputPort *port in [connection inputPorts])
            {
                if ([[port mediaType] isEqual:AVMediaTypeVideo] )
                {
                    videoConnection = connection;
                    break;
                }
            }
            if (videoConnection)
            {
                break;
            }
        }


        [_stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
         {
             NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
             img = [[UIImage alloc] initWithData:imageData];
             // I used the image in another view controller (str: the scanned barcode string)
             [MainViewController setResultTexts:str img:img];
             [MainViewController set_from_view:0 scanner:1];
             [self dismissViewControllerAnimated:YES completion:nil];

         }];

    }
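
One small addition worth making here (my suggestion, not part of the answer): check the error and the sample buffer in the completion handler before building the image, since the handler can fire without a usable buffer if the capture fails. For example:

    [_stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
     {
         // Bail out if the capture failed instead of building an image from nil data.
         if (error != nil || imageSampleBuffer == NULL) {
             NSLog(@"Still image capture failed: %@", error);
             return;
         }
         NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
         img = [[UIImage alloc] initWithData:imageData];
         // ...continue exactly as in the answer above
     }];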



Answer 2:


Try this. Here contentScrollview is my scroll view; I'm capturing all of its content as a screenshot. You can replace it with your own view object.

- (UIImage *)imageFromViewIniOS7
{
    UIImage *image = nil;
    UIGraphicsBeginImageContext(contentScrollview.contentSize);
    {
        CGPoint savedContentOffset = contentScrollview.contentOffset;
        CGRect savedFrame = contentScrollview.frame;

        contentScrollview.contentOffset = CGPointZero;
        contentScrollview.frame = CGRectMake(0, 0, contentScrollview.contentSize.width, contentScrollview.contentSize.height);

        // drawViewHierarchyInRect: exists only on iOS 7 and later
        // (versionofiOS is presumably a custom NSString category returning the iOS version string)
        if ([[NSString versionofiOS] intValue] >= 7)
        {
            [contentScrollview drawViewHierarchyInRect:contentScrollview.bounds afterScreenUpdates:YES];
        }
        else
        {
            [contentScrollview.layer renderInContext:UIGraphicsGetCurrentContext()];
        }
        image = UIGraphicsGetImageFromCurrentImageContext();

        contentScrollview.contentOffset = savedContentOffset;
        contentScrollview.frame = savedFrame;
    }
    UIGraphicsEndImageContext();

    return image;
}
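
For the original question (capturing self.view rather than a scroll view), the same idea reduces to a small hypothetical helper, sketched below. Note, though, that in my experience neither drawViewHierarchyInRect: nor renderInContext: reliably captures live AVCaptureVideoPreviewLayer content, so the still-image output approach from Answer 1 is the more dependable way to get the camera frame itself.

    // Hypothetical helper (not from the answer): snapshot an arbitrary view.
    - (UIImage *)snapshotOfView:(UIView *)view
    {
        UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);
        if ([view respondsToSelector:@selector(drawViewHierarchyInRect:afterScreenUpdates:)]) {
            // iOS 7 and later
            [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES];
        } else {
            // pre-iOS 7 fallback
            [view.layer renderInContext:UIGraphicsGetCurrentContext()];
        }
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return image;
    }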


Source: https://stackoverflow.com/questions/22397904/getting-only-white-screenshot
