AVCaptureDevice Camera Zoom

Submitted by 和自甴很熟 on 2019-12-03 04:21:02

Question


I have a simple AVCaptureSession running to get a camera feed in my app and take photos. How can I implement the 'pinch to zoom' functionality using a UIGestureRecognizer for the camera?


Answer 1:


The accepted answer is outdated, and I'm not sure it actually captures the photo at the zoomed-in level. There is a method to zoom in, as bcattle's answer says, but his answer does not account for the fact that the user can zoom in, release, and then resume pinching from that zoom position. His solution produces jumps in the zoom level that are not very elegant.

The easiest and most elegant way of doing this is to use the velocity of the pinch gesture.

- (void)handlePinchToZoomRecognizer:(UIPinchGestureRecognizer *)pinchRecognizer {
    const CGFloat pinchVelocityDividerFactor = 5.0f;

    if (pinchRecognizer.state == UIGestureRecognizerStateChanged) {
        NSError *error = nil;
        if ([videoDevice lockForConfiguration:&error]) {
            CGFloat desiredZoomFactor = videoDevice.videoZoomFactor + atan2f(pinchRecognizer.velocity, pinchVelocityDividerFactor);
            // Clamp desiredZoomFactor to the valid range [1.0, activeFormat.videoMaxZoomFactor]
            videoDevice.videoZoomFactor = MAX(1.0, MIN(desiredZoomFactor, videoDevice.activeFormat.videoMaxZoomFactor));
            [videoDevice unlockForConfiguration];
        } else {
            NSLog(@"error: %@", error);
        }
    }
}

I found that applying the arctangent to the velocity eases the zoom-in/zoom-out effect a bit. It is not perfect, but the effect is good enough for most needs. Another function could probably ease the zoom out further as it approaches 1.

NOTE: The scale of a pinch gesture runs from 0 to infinity, with 0 to 1 being pinching in (zoom out) and 1 to infinity being pinching out (zoom in), so mapping it well onto a zoom factor requires some math. Velocity, by contrast, runs from -infinity to +infinity with 0 as the resting point, which makes it a more natural input here.
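The effect of the arctangent easing can be checked numerically. Here is a minimal Python sketch (not iOS code; the divider and the maximum zoom are made-up values matching the snippet above):

```python
import math

def eased_zoom_delta(velocity, divider=5.0):
    """Map pinch velocity (-inf..+inf) to a bounded zoom increment.

    atan2(v, divider) equals atan(v / divider) for divider > 0, so the
    increment always stays within (-pi/2, +pi/2): a fast pinch cannot
    produce a huge jump in one update.
    """
    return math.atan2(velocity, divider)

def next_zoom(current, velocity, max_zoom=4.0, divider=5.0):
    """Apply the eased increment, then clamp to [1.0, max_zoom]."""
    desired = current + eased_zoom_delta(velocity, divider)
    return max(1.0, min(desired, max_zoom))
```

Even an extreme velocity moves the zoom by less than pi/2 per update, and the clamp keeps the factor inside the range the device accepts.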

EDIT: Fixed crash on range exception. Thanks to @garafajon!




Answer 2:


Many have tried to do this by setting the transform property of the preview layer to CGAffineTransformMakeScale(gesture.scale.x, gesture.scale.y). See here for a full-fledged implementation of pinch-to-zoom.




Answer 3:


Since iOS 7 you can set the zoom directly with the videoZoomFactor property of AVCaptureDevice.

Tie the scale property of the UIPinchGestureRecognizer to the videoZoomFactor with a scaling constant. This lets you vary the sensitivity to taste:

-(void) handlePinchToZoomRecognizer:(UIPinchGestureRecognizer*)pinchRecognizer {
    const CGFloat pinchZoomScaleFactor = 2.0;

    if (pinchRecognizer.state == UIGestureRecognizerStateChanged) {
        NSError *error = nil;
        if ([videoDevice lockForConfiguration:&error]) {
            videoDevice.videoZoomFactor = 1.0 + pinchRecognizer.scale * pinchZoomScaleFactor;
            [videoDevice unlockForConfiguration];
        } else {
            NSLog(@"error: %@", error);
        }
    }
}

Note that AVCaptureDevice, along with everything else related to AVCaptureSession, is not thread safe, so you probably don't want to do this from the main queue.




Answer 4:


Swift 4

Add a pinch gesture recognizer to the front-most view and connect it to this action (pinchToZoom). captureDevice should be the instance currently providing input to the capture session. pinchToZoom provides smooth zooming for both front and back capture devices.

@IBAction func pinchToZoom(_ pinch: UIPinchGestureRecognizer) {
    guard let device = captureDevice else { return }

    func minMaxZoom(_ factor: CGFloat) -> CGFloat {
        return min(max(factor, 1.0), device.activeFormat.videoMaxZoomFactor)
    }

    func update(scale factor: CGFloat) {
        do {
            try device.lockForConfiguration()
            defer { device.unlockForConfiguration() }
            device.videoZoomFactor = factor
        } catch {
            debugPrint(error)
        }
    }

    let newScaleFactor = minMaxZoom(pinch.scale * zoomFactor)

    switch pinch.state {
    case .began, .changed:
        update(scale: newScaleFactor)
    case .ended:
        zoomFactor = minMaxZoom(newScaleFactor)
        update(scale: zoomFactor)
    default:
        break
    }
}

It's useful to declare zoomFactor on your camera object or view controller; I usually put it on the same singleton that owns the AVCaptureSession. It acts as the persisted value of captureDevice's videoZoomFactor between gestures (declare it as CGFloat so it matches videoZoomFactor's type):

var zoomFactor: CGFloat = 1.0



Answer 5:


In Swift, you can zoom in and out simply by assigning a scaled value to videoZoomFactor. The following code in a UIPinchGestureRecognizer handler solves the issue:

do {
    try device.lockForConfiguration()
    switch gesture.state {
    case .began:
        self.pivotPinchScale = device.videoZoomFactor
    case .changed:
        var factor = self.pivotPinchScale * gesture.scale
        factor = max(1, min(factor, device.activeFormat.videoMaxZoomFactor))
        device.videoZoomFactor = factor
    default:
        break
    }
    device.unlockForConfiguration()
} catch {
    // handle exception
}

Here, pivotPinchScale is a CGFloat property declared somewhere in your controller.

You may also refer to the following project to see how the camera works with UIPinchGestureRecognizer: https://github.com/DragonCherry/CameraPreviewController




Answer 6:


I started from @Gabriel Cartier's solution (thanks). In my code I preferred the smoother rampToVideoZoomFactor: and a simpler way to compute the device's scale factor.

- (IBAction)pinchForZoom:(id)sender forEvent:(UIEvent *)event {
    UIPinchGestureRecognizer* pinchRecognizer = (UIPinchGestureRecognizer *)sender;

    static CGFloat zoomFactorBegin = .0;
    if ( UIGestureRecognizerStateBegan == pinchRecognizer.state ) {
        zoomFactorBegin = self.captureDevice.videoZoomFactor;

    } else if (UIGestureRecognizerStateChanged == pinchRecognizer.state) {
        NSError *error = nil;
        if ([self.captureDevice lockForConfiguration:&error]) {

            CGFloat desiredZoomFactor = zoomFactorBegin * pinchRecognizer.scale;
            CGFloat zoomFactor = MAX(1.0, MIN(desiredZoomFactor, self.captureDevice.activeFormat.videoMaxZoomFactor));
            [self.captureDevice rampToVideoZoomFactor:zoomFactor withRate:3.0];

            [self.captureDevice unlockForConfiguration];
        } else {
            NSLog(@"error: %@", error);
        }
    }
}



Answer 7:


Based on @Gabriel Cartier's answer:

- (void) cameraZoomWithPinchVelocity: (CGFloat)velocity {
    CGFloat pinchVelocityDividerFactor = 40.0f;
    if (velocity < 0) {
        pinchVelocityDividerFactor = 5.0f; // zoom in faster than zoom out
    }

    if (_videoInput) {
        if([[_videoInput device] position] == AVCaptureDevicePositionBack) {
            NSError *error = nil;
            if ([[_videoInput device] lockForConfiguration:&error]) {
                CGFloat desiredZoomFactor = [_videoInput device].videoZoomFactor + atan2f(velocity, pinchVelocityDividerFactor);
                // Check if desiredZoomFactor fits required range from 1.0 to activeFormat.videoMaxZoomFactor
                CGFloat maxFactor = MIN(10, [_videoInput device].activeFormat.videoMaxZoomFactor);
                [_videoInput device].videoZoomFactor = MAX(1.0, MIN(desiredZoomFactor, maxFactor));
                [[_videoInput device] unlockForConfiguration];
            } else {
                NSLog(@"cameraZoomWithPinchVelocity error: %@", error);
            }
        }
    }
}



Answer 8:


There is an easier way to handle the camera zoom level with a pinch recognizer. The only thing you need to do is take cameraDevice.videoZoomFactor and assign it to the recognizer's scale in the .began state, like this:

@objc private func viewPinched(recognizer: UIPinchGestureRecognizer) {
    switch recognizer.state {
        case .began:
            recognizer.scale = cameraDevice.videoZoomFactor
        case .changed:
            let scale = recognizer.scale
            do {
                 try cameraDevice.lockForConfiguration()
                 cameraDevice.videoZoomFactor = max(cameraDevice.minAvailableVideoZoomFactor, min(scale, cameraDevice.maxAvailableVideoZoomFactor))
                 cameraDevice.unlockForConfiguration()
            }
            catch {
                print(error)
            }
        default:
            break
    }
}
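Why this avoids jumps between gestures can be checked with a tiny simulation (Python, not iOS code; it models the recognizer's cumulative scale under the assumption that setting scale re-seeds its baseline, and the zoom limits are made up):

```python
def clamp(zoom, lo=1.0, hi=6.0):
    """Keep the zoom inside the device's supported range."""
    return max(lo, min(zoom, hi))

def pinch_session(start_zoom, ratios):
    """Simulate one pinch gesture.

    `ratios` are the cumulative finger-distance ratios reported since
    .began. Seeding the recognizer's scale with start_zoom means the
    reported scale is start_zoom * ratio, i.e. already an absolute
    zoom target rather than a relative one.
    """
    zoom = start_zoom
    for ratio in ratios:
        zoom = clamp(start_zoom * ratio)
    return zoom
```

A second gesture that begins after the first ends picks up exactly where the zoom left off, with no discontinuity.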



Answer 9:


I am using iOS SDK 8.3 with the AVFoundation framework, and the following method worked for me:

nameOfAVCaptureVideoPreviewLayer.affineTransform = CGAffineTransformMakeScale(scaleX, scaleY) 

To save the picture at the same scale, I used the following:

nameOfAVCaptureConnection.videoScaleAndCropFactor = factorNumber; 

The code below captures the image at that scale:

[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer != NULL) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [UIImage imageWithData:imageData];
    }
}];


Source: https://stackoverflow.com/questions/10220958/avcapturedevice-camera-zoom
