avcapturedevice

iPhone 4 AVFoundation: Capture from front and rear cameras simultaneously

Question: I was wondering whether it is possible to capture from both cameras simultaneously using the AVFoundation framework. Specifically, can both the front and rear AVCaptureDevices be active at the same time? Currently I know that an AVCaptureSession instance can support only one camera input (and output). I create two AVCaptureSessions, attach the front camera device to one and the rear camera to the other, and then point the outputs of the sessions at different SampleBufferDelegate callbacks. What I see …
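
A minimal Objective-C sketch of the two-session wiring the question describes, under the assumption that frontDelegate, rearDelegate, frontQueue and rearQueue already exist (whether both sessions actually deliver frames concurrently is exactly what is being asked; this only shows the setup):

    AVCaptureDevice *frontCamera = nil, *rearCamera = nil;
    for (AVCaptureDevice *camera in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (camera.position == AVCaptureDevicePositionFront) frontCamera = camera;
        if (camera.position == AVCaptureDevicePositionBack)  rearCamera = camera;
    }

    // One session per camera, each with its own data output and delegate queue.
    NSError *error = nil;
    AVCaptureSession *frontSession = [[AVCaptureSession alloc] init];
    [frontSession addInput:[AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error]];
    AVCaptureVideoDataOutput *frontOutput = [[AVCaptureVideoDataOutput alloc] init];
    [frontOutput setSampleBufferDelegate:frontDelegate queue:frontQueue];
    [frontSession addOutput:frontOutput];

    AVCaptureSession *rearSession = [[AVCaptureSession alloc] init];
    [rearSession addInput:[AVCaptureDeviceInput deviceInputWithDevice:rearCamera error:&error]];
    AVCaptureVideoDataOutput *rearOutput = [[AVCaptureVideoDataOutput alloc] init];
    [rearOutput setSampleBufferDelegate:rearDelegate queue:rearQueue];
    [rearSession addOutput:rearOutput];

    [frontSession startRunning];
    [rearSession startRunning];

In practice, devices of that generation cannot stream from both cameras at once, so starting the second session typically interrupts the first.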

iPhone Camera Focussing

I used the code below to focus the iPhone camera, but it is not working. I took this code from Apple's AVCam sample code. Am I doing anything wrong? Is there any method to detect whether the iPhone has finished focusing?

    - (void)focusAtPoint:(CGPoint)point {
        AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        if (device != nil) {
            if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
                NSError *error;
                if ([device lockForConfiguration:&error]) {
                    [device setFocusPointOfInterest:point];
                    [device setFocusMode:AVCaptureFocusModeAutoFocus];
                    [device unlockForConfiguration];
                }
            }
        }
    }
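
One way to detect when autofocus has finished is to key-value observe the device's adjustingFocus property, which flips to NO when the focus operation completes. A minimal sketch (the registration point and the logging are illustrative, not part of the original code):

    // Register once, for example right after obtaining the capture device.
    [device addObserver:self
             forKeyPath:@"adjustingFocus"
                options:NSKeyValueObservingOptionNew
                context:NULL];

    - (void)observeValueForKeyPath:(NSString *)keyPath
                          ofObject:(id)object
                            change:(NSDictionary *)change
                           context:(void *)context {
        if ([keyPath isEqualToString:@"adjustingFocus"]) {
            BOOL isAdjusting = [[change objectForKey:NSKeyValueChangeNewKey] boolValue];
            if (!isAdjusting) {
                // The device has stopped hunting, so the focus operation is done.
                NSLog(@"Focusing finished");
            }
        }
    }

Remember to unregister with removeObserver:forKeyPath: before the device or the observing object goes away.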

AVCaptureSession rotate | orientation while video transmitting

I am developing a video streaming application in which I need to capture front-camera video frames, encode them, and then transfer them to the other end. A typical flow looks like this: AVCaptureSession -> AVCaptureDeviceInput -> AVCaptureVideoDataOutput -> capture frame -> encode frame -> send frame to the other end. It works fine; I have set kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange as the frame format, and a preview layer is used to show the preview. The problem comes when the device orientation changes: if the device is moved from portrait to landscape, the frames on the other end are rotated by 90 degrees. I was …
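
One common way to keep the outgoing frames upright is to update the video orientation of the data output's connection whenever the interface rotates. A sketch, assuming videoDataOutput is the AVCaptureVideoDataOutput already attached to the session:

    // Map the current device orientation onto the capture connection so the
    // sample buffers handed to the encoder are already rotated correctly.
    AVCaptureConnection *connection = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([connection isVideoOrientationSupported]) {
        switch ([[UIDevice currentDevice] orientation]) {
            case UIDeviceOrientationLandscapeLeft:
                connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
                break;
            case UIDeviceOrientationLandscapeRight:
                connection.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
                break;
            case UIDeviceOrientationPortraitUpsideDown:
                connection.videoOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
                break;
            default:
                connection.videoOrientation = AVCaptureVideoOrientationPortrait;
                break;
        }
    }

On older iOS versions the data-output connection may report that rotation is unsupported; in that case the rotation has to be applied during encoding or signalled to the receiving end instead.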

Red audio recording status bar “flashes” while app in *foreground*

Question: There are many questions (here, here) about the double-height red audio recording status bar, but all of them refer to flashes that happen when the app resigns into the background. I'm getting a flash, which I assume comes from an AVCaptureSession setup, while the app is in the foreground. Has anyone experienced this before?

Answer 1: You have to remove the audio input from the AVCaptureSession:

    [self.captureSession removeInput:audioIn];

where audioIn is the AVCaptureDeviceInput object that is initialised in …
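
A sketch of the pattern the answer points at: keep the audio input out of the session while only previewing, and add it inside a configuration block when recording actually starts (audioIn and captureSession follow the answer's naming; startRecording and stopRecording are illustrative hooks):

    // Add the microphone only while recording, so the red recording bar does
    // not flash while the session is merely previewing video.
    - (void)startRecording {
        [self.captureSession beginConfiguration];
        if ([self.captureSession canAddInput:self.audioIn]) {
            [self.captureSession addInput:self.audioIn];
        }
        [self.captureSession commitConfiguration];
        // ... begin writing audio/video samples here ...
    }

    - (void)stopRecording {
        // ... finish writing samples here ...
        [self.captureSession beginConfiguration];
        [self.captureSession removeInput:self.audioIn];
        [self.captureSession commitConfiguration];
    }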

Real-time face detection with the camera in Swift 3

How can I do face detection in real time, just as the built-in Camera app does, with a white round shape drawn around and over the face? I use AVCaptureSession. I found that I could run facial detection only on the image I saved. Below I have attached my current code; it only captures an image when I press the button and saves it to the photo gallery. Can someone please help me draw a real-time round shape that follows the person's face?

    class CameraFaceRecongnitionVC: UIViewController {
        @IBOutlet weak var imgOverlay: UIImageView!
        @IBOutlet weak var btnCapture: UIButton!
        let captureSession = AVCaptureSession()
        let …
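
One way to get the real-time behaviour the question asks for is AVCaptureMetadataOutput with face metadata, which reports a bounding box for each detected face as frames arrive. A sketch in Objective-C for consistency with the other snippets on this page (captureSession and previewLayer are assumed to exist; the same classes are available from Swift):

    // Ask the session to report faces as metadata objects.
    AVCaptureMetadataOutput *metadataOutput = [[AVCaptureMetadataOutput alloc] init];
    if ([captureSession canAddOutput:metadataOutput]) {
        [captureSession addOutput:metadataOutput];
        [metadataOutput setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
        if ([metadataOutput.availableMetadataObjectTypes containsObject:AVMetadataObjectTypeFace]) {
            metadataOutput.metadataObjectTypes = @[AVMetadataObjectTypeFace];
        }
    }

    // Delegate callback: convert each face into preview-layer coordinates and
    // position a circular overlay (for example a CAShapeLayer) over it.
    - (void)captureOutput:(AVCaptureOutput *)output
    didOutputMetadataObjects:(NSArray *)metadataObjects
           fromConnection:(AVCaptureConnection *)connection {
        for (AVMetadataObject *object in metadataObjects) {
            if (![object.type isEqualToString:AVMetadataObjectTypeFace]) continue;
            AVMetadataObject *face = [previewLayer transformedMetadataObjectForMetadataObject:object];
            // face.bounds is now in the preview layer's coordinate space;
            // move and resize the round overlay to face.bounds here.
        }
    }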

iOS 5 - AVCaptureDevice setting focus point and focus mode freezes the live camera picture

I have been using the following method to set the point of focus since iOS 4:

    - (void)focusAtPoint:(CGPoint)point {
        AVCaptureDevice *device = [[self captureInput] device];
        NSError *error;
        if ([device isFocusModeSupported:AVCaptureFocusModeAutoFocus] && [device isFocusPointOfInterestSupported]) {
            if ([device lockForConfiguration:&error]) {
                [device setFocusPointOfInterest:point];
                [device setFocusMode:AVCaptureFocusModeAutoFocus];
                [device unlockForConfiguration];
            } else {
                NSLog(@"Error: %@", error);
            }
        }
    }

On iOS 4 devices this works without any problems. But on iOS 5 the live camera feed freezes, and after …
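
For completeness, here is a related pattern from Apple's AVCam-style samples: after a one-shot tap-to-focus, enable subject-area-change monitoring and fall back to continuous autofocus when the scene changes, so the device does not stay locked on the tapped point. This is a sketch of that pattern, not a confirmed fix for the iOS 5 freeze described above:

    // After focusAtPoint: has run, watch for scene changes on the device...
    if ([device lockForConfiguration:&error]) {
        device.subjectAreaChangeMonitoringEnabled = YES;
        [device unlockForConfiguration];
    }
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(subjectAreaDidChange:)
                                                 name:AVCaptureDeviceSubjectAreaDidChangeNotification
                                               object:device];

    // ...and hand control back to continuous autofocus at the frame centre.
    - (void)subjectAreaDidChange:(NSNotification *)notification {
        AVCaptureDevice *device = [[self captureInput] device];
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
                if ([device isFocusPointOfInterestSupported]) {
                    device.focusPointOfInterest = CGPointMake(0.5, 0.5);
                }
                device.focusMode = AVCaptureFocusModeContinuousAutoFocus;
            }
            [device unlockForConfiguration];
        }
    }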

Show camera stream while an AVCaptureSession is running

I was able to capture video frames from the camera using AVCaptureSession, following http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html. However, AVCaptureSession captures frames from the camera without showing the camera stream on the screen. I would like to also show the camera stream, just as UIImagePicker does, so that the user knows the camera is turned on and can see what it is pointed at. Any help or pointers would be appreciated!

Answer: AVCaptureVideoPreviewLayer is exactly what you're looking for. The code fragment Apple uses to demonstrate how to use it …
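
A minimal sketch of attaching such a preview layer to an existing session (captureSession and the hosting view controller are assumed to already exist):

    // Show the live camera feed by backing a layer with the running session.
    AVCaptureVideoPreviewLayer *previewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
    previewLayer.frame = self.view.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:previewLayer];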