iOS: Capture image from front facing camera

耶瑟儿~ 2020-12-01 00:56

I am making an application where I would like to capture an image from the front facing camera, without presenting a capture screen of any kind. I want to take the picture completely in code, without any user interaction. How can I do this?

5 Answers
  • 2020-12-01 00:57

    The documentation for the UIImagePickerController class lists a takePicture method. It says:

    Use this method in conjunction with a custom overlay view to initiate the programmatic capture of a still image. This supports taking more than one picture without leaving the interface, but requires that you hide the default image picker controls.

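    For illustration, here is a minimal Swift sketch of that approach (the delegate wiring is assumed; the picker must still be presented, but with its default controls hidden and an overlay of your own):

    let picker = UIImagePickerController()
    picker.sourceType = .camera
    picker.cameraDevice = .front
    picker.showsCameraControls = false   // hide the default capture UI
    picker.cameraOverlayView = UIView()  // your own (possibly empty) overlay
    picker.delegate = self               // UIImagePickerControllerDelegate & UINavigationControllerDelegate
    present(picker, animated: false) {
        picker.takePicture()             // result arrives via imagePickerController(_:didFinishPickingMediaWithInfo:)
    }
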
  • 2020-12-01 01:08

    How to capture an image from the front-facing camera using AVFoundation:

    Development caveats:

    • Check your app and image orientation settings carefully.
    • AVFoundation and its associated frameworks are nasty behemoths and very difficult to understand/implement. I've made my code as lean as possible, but please check out this excellent tutorial for a better explanation (the site is no longer online, but the post can be retrieved via archive.org): http://www.benjaminloulier.com/posts/ios4-and-direct-access-to-the-camera

    ViewController.h

    // Frameworks
    #import <CoreVideo/CoreVideo.h>
    #import <CoreMedia/CoreMedia.h>
    #import <AVFoundation/AVFoundation.h>
    #import <UIKit/UIKit.h>
    
    @interface CameraViewController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>
    
    // Camera
    @property (weak, nonatomic) IBOutlet UIImageView* cameraImageView;
    @property (strong, nonatomic) AVCaptureDevice* device;
    @property (strong, nonatomic) AVCaptureSession* captureSession;
    @property (strong, nonatomic) AVCaptureVideoPreviewLayer* previewLayer;
    @property (strong, nonatomic) UIImage* cameraImage;
    
    @end
    

    ViewController.m

    #import "CameraViewController.h"
    
    @implementation CameraViewController
    
    - (void)viewDidLoad
    {
        [super viewDidLoad];
    
        [self setupCamera];
        [self setupTimer];
    }
    
    - (void)setupCamera
    {    
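        // Pick the front-facing camera from the available video devices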
        NSArray* devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        for(AVCaptureDevice *device in devices)
        {
            if([device position] == AVCaptureDevicePositionFront)
                self.device = device;
        }
    
        AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];
        AVCaptureVideoDataOutput* output = [[AVCaptureVideoDataOutput alloc] init];
        output.alwaysDiscardsLateVideoFrames = YES;
    
        dispatch_queue_t queue;
        queue = dispatch_queue_create("cameraQueue", NULL);
        [output setSampleBufferDelegate:self queue:queue];
    
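        // Request BGRA frames so the bitmap context in captureOutput can read them directly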
        NSString* key = (NSString *) kCVPixelBufferPixelFormatTypeKey;
        NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
        NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
        [output setVideoSettings:videoSettings];
    
        self.captureSession = [[AVCaptureSession alloc] init];
        [self.captureSession addInput:input];
        [self.captureSession addOutput:output];
        [self.captureSession setSessionPreset:AVCaptureSessionPresetPhoto];
    
        self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
        self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    
        // CHECK FOR YOUR APP
        self.previewLayer.frame = CGRectMake(0, 0, self.view.frame.size.height, self.view.frame.size.width);
        self.previewLayer.orientation = AVCaptureVideoOrientationLandscapeRight;
        // CHECK FOR YOUR APP
    
        [self.view.layer insertSublayer:self.previewLayer atIndex:0];   // Comment-out to hide preview layer
    
        [self.captureSession startRunning];
    }
    
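    // Delegate callback: runs on cameraQueue for every frame and converts the BGRA pixel buffer into a UIImage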
    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
    {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer,0);
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);
    
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    
        CGContextRelease(newContext);
        CGColorSpaceRelease(colorSpace);
    
        self.cameraImage = [UIImage imageWithCGImage:newImage scale:1.0f orientation:UIImageOrientationDownMirrored];
    
        CGImageRelease(newImage);
    
        CVPixelBufferUnlockBaseAddress(imageBuffer,0);
    }
    
    - (void)setupTimer
    {
        // Fires on the main thread and displays the most recently captured frame
        [NSTimer scheduledTimerWithTimeInterval:2.0f target:self selector:@selector(snapshot) userInfo:nil repeats:YES];
    }
    
    - (void)snapshot
    {
        NSLog(@"SNAPSHOT");
        self.cameraImageView.image = self.cameraImage;  // Comment-out to hide snapshot
    }
    
    @end
    

    Connect this up to a UIViewController with a UIImageView for the snapshot and it'll work! Snapshots are taken programmatically at 2.0 second intervals without any user input. Comment out the selected lines to remove the preview layer and snapshot feedback.

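    Note (not part of the original answer): on iOS 10 and later the app must also declare NSCameraUsageDescription in its Info.plist, or it will crash the moment it touches the camera. You can also request permission explicitly before starting the session; a minimal Swift sketch:

    // Assumes Info.plist already contains an NSCameraUsageDescription entry (required on iOS 10+)
    AVCaptureDevice.requestAccess(for: .video) { granted in
        // Only start the capture session once access has been granted
    }
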
    Any more questions/comments, please let me know!

  • 2020-12-01 01:15

    Here is the code above converted to Swift 4:

    import UIKit
    import AVFoundation
    
    class CameraViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    
        @IBOutlet weak var cameraImageView: UIImageView!
    
        var device: AVCaptureDevice?
        var captureSession: AVCaptureSession?
        var previewLayer: AVCaptureVideoPreviewLayer?
        var cameraImage: UIImage?
    
        override func viewDidLoad() {
            super.viewDidLoad()
    
            setupCamera()
            setupTimer()
        }
    
        func setupCamera() {
            let discoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera],
                                                                    mediaType: .video,
                                                                    position: .front)
            device = discoverySession.devices.first
    
            guard let device = device else { return }
    
            let input: AVCaptureDeviceInput
            do {
                input = try AVCaptureDeviceInput(device: device)
            } catch {
                return
            }
    
            let output = AVCaptureVideoDataOutput()
            output.alwaysDiscardsLateVideoFrames = true
    
            let queue = DispatchQueue(label: "cameraQueue")
            output.setSampleBufferDelegate(self, queue: queue)
            output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    
            captureSession = AVCaptureSession()
            captureSession?.addInput(input)
            captureSession?.addOutput(output)
            captureSession?.sessionPreset = .photo
    
            previewLayer = AVCaptureVideoPreviewLayer(session: captureSession!)
            previewLayer?.videoGravity = .resizeAspectFill
            previewLayer?.frame = CGRect(x: 0.0, y: 0.0, width: view.frame.width, height: view.frame.height)
    
            view.layer.insertSublayer(previewLayer!, at: 0)
    
            captureSession?.startRunning()
        }
    
        // Runs on cameraQueue for every frame; converts the BGRA pixel buffer into a UIImage
        func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
            guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
            let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
            let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
            let width = CVPixelBufferGetWidth(imageBuffer)
            let height = CVPixelBufferGetHeight(imageBuffer)
    
            let colorSpace = CGColorSpaceCreateDeviceRGB()
            let newContext = CGContext(data: baseAddress, width: width, height: height, bitsPerComponent: 8,
                                       bytesPerRow: bytesPerRow, space: colorSpace,
                                       bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue)
    
            if let newImage = newContext?.makeImage() {
                // Mirrored to match the Objective-C answer's UIImageOrientationDownMirrored
                cameraImage = UIImage(cgImage: newImage, scale: 1.0, orientation: .downMirrored)
            }
    
            CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
        }
    
        func setupTimer() {
            _ = Timer.scheduledTimer(timeInterval: 2.0, target: self, selector: #selector(snapshot), userInfo: nil, repeats: true)
        }
    
        @objc func snapshot() {
            print("SNAPSHOT")
            cameraImageView.image = cameraImage
        }
    }
    
  • 2020-12-01 01:17

    You probably need to use AVFoundation to capture the video stream/image without displaying it. Unlike UIImagePickerController, it doesn't work out of the box. Look at Apple's AVCam sample code as a starting point; a rough sketch of the idea follows.

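    For reference, a minimal Swift sketch of that idea (the HeadlessCamera name and wiring are my own invention, not AVCam's): attach an AVCapturePhotoOutput to a session and skip the preview layer entirely, so nothing is ever drawn on screen.

    import AVFoundation
    import UIKit

    // Hypothetical helper: captures one still from the front camera with no on-screen UI (iOS 11+)
    final class HeadlessCamera: NSObject, AVCapturePhotoCaptureDelegate {
        private let session = AVCaptureSession()
        private let photoOutput = AVCapturePhotoOutput()
        private var completion: ((UIImage?) -> Void)?

        func start() throws {
            guard let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front) else { return }
            session.addInput(try AVCaptureDeviceInput(device: device))
            session.addOutput(photoOutput)
            session.startRunning()
        }

        func takePhoto(completion: @escaping (UIImage?) -> Void) {
            self.completion = completion
            photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
        }

        // Delivers the captured still; convert its encoded data back into a UIImage
        func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
            completion?(photo.fileDataRepresentation().flatMap { UIImage(data: $0) })
        }
    }
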
  • 2020-12-01 01:18

    I converted the code above from Objective-C to Swift 3, in case anyone is still looking for a solution in 2017.

    import UIKit
    import AVFoundation
    
    class CameraViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    
        @IBOutlet weak var cameraImageView: UIImageView!
    
        var device: AVCaptureDevice?
        var captureSession: AVCaptureSession?
        var previewLayer: AVCaptureVideoPreviewLayer?
        var cameraImage: UIImage?
    
        override func viewDidLoad() {
            super.viewDidLoad()
    
            setupCamera()
            setupTimer()
        }
    
        func setupCamera() {
            let discoverySession = AVCaptureDeviceDiscoverySession(deviceTypes: [.builtInWideAngleCamera],
                                                                   mediaType: AVMediaTypeVideo,
                                                                   position: .front)
            device = discoverySession?.devices.first
    
            let input: AVCaptureDeviceInput
            do {
                input = try AVCaptureDeviceInput(device: device)
            } catch {
                return
            }
    
            let output = AVCaptureVideoDataOutput()
            output.alwaysDiscardsLateVideoFrames = true
    
            let queue = DispatchQueue(label: "cameraQueue")
            output.setSampleBufferDelegate(self, queue: queue)
            output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as AnyHashable: kCVPixelFormatType_32BGRA]
    
            captureSession = AVCaptureSession()
            captureSession?.addInput(input)
            captureSession?.addOutput(output)
            captureSession?.sessionPreset = AVCaptureSessionPresetPhoto
    
            previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
            previewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
            previewLayer?.frame = CGRect(x: 0.0, y: 0.0, width: view.frame.width, height: view.frame.height)
    
            view.layer.insertSublayer(previewLayer!, at: 0)
    
            captureSession?.startRunning()
        }
    
        // Runs on cameraQueue for every frame; converts the BGRA pixel buffer into a UIImage
        func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
            guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
            let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
            let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
            let width = CVPixelBufferGetWidth(imageBuffer)
            let height = CVPixelBufferGetHeight(imageBuffer)
    
            let colorSpace = CGColorSpaceCreateDeviceRGB()
            let newContext = CGContext(data: baseAddress, width: width, height: height, bitsPerComponent: 8,
                                       bytesPerRow: bytesPerRow, space: colorSpace,
                                       bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue)
    
            if let newImage = newContext?.makeImage() {
                // Mirrored to match the Objective-C answer's UIImageOrientationDownMirrored
                cameraImage = UIImage(cgImage: newImage, scale: 1.0, orientation: .downMirrored)
            }
    
            CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
        }
    
        func setupTimer() {
            _ = Timer.scheduledTimer(timeInterval: 2.0, target: self, selector: #selector(snapshot), userInfo: nil, repeats: true)
        }
    
        func snapshot() {
            print("SNAPSHOT")
            cameraImageView.image = cameraImage
        }
    }
    

    Also, I found a shorter solution for getting the image from the CMSampleBuffer:

    func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
        let myPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
        let myCIimage = CIImage(cvPixelBuffer: myPixelBuffer!)
        let videoImage = UIImage(ciImage: myCIimage)
        cameraImage = videoImage
    }
    
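    One caveat with the shortcut (worth checking in your own app): a UIImage created from a CIImage has no bitmap backing until it is rendered. UIImageView will usually display it, but if you need a real CGImage (e.g. to produce JPEG data), render it through a CIContext first:

    // Hypothetical follow-up: force-render the CIImage into a CGImage-backed UIImage
    let context = CIContext()
    if let cgImage = context.createCGImage(myCIimage, from: myCIimage.extent) {
        cameraImage = UIImage(cgImage: cgImage)
    }
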