I am using the standard AVFoundation classes to capture video and show a preview (http://developer.apple.com/library/ios/#qa/qa1702/_index.html)
Old question, but it may save somebody hours of frustration anyway. It's important to set the point of interest before calling setFocusMode, otherwise your camera will focus on the previous focus point. Think of setFocusMode as a COMMIT. The same applies to setExposureMode.
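For example, here is a minimal sketch of the same point-then-mode order applied to exposure (assuming a device you've already obtained; (0.5, 0.5) is just the centre of the frame):

NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    // Point first...
    if ([device isExposurePointOfInterestSupported])
        [device setExposurePointOfInterest:CGPointMake(0.5f, 0.5f)];
    // ...then the mode, which "commits" the point.
    if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure])
        [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
    [device unlockForConfiguration];
}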
AVCam sample by Apple is totally wrong and broken.
A few more points: I've noticed that the video presets take longer to initialise than the photo preset.
Are you recording video or taking photos?
I've noticed you have a medium-quality preset with 32BGRA frames; it could work out better to set the session preset to Photo and downsample the image after capture. That also lets you request AVVideoCodecJPEG on a still-image output instead of 32BGRA:
[stillImageOutput setOutputSettings:[NSDictionary dictionaryWithObject:AVVideoCodecJPEG forKey:AVVideoCodecKey]];
Instead of:
[output setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
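A minimal sketch of that Photo-preset route, assuming you add an AVCaptureStillImageOutput to the session (stillImageOutput is an illustrative name):

self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
AVCaptureStillImageOutput *stillImageOutput = [[[AVCaptureStillImageOutput alloc] init] autorelease];
stillImageOutput.outputSettings = [NSDictionary dictionaryWithObject:AVVideoCodecJPEG
                                                               forKey:AVVideoCodecKey];
if ([[self captureSession] canAddOutput:stillImageOutput])
    [[self captureSession] addOutput:stillImageOutput];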
You also might want to register for AVCaptureDeviceSubjectAreaDidChangeNotification (with subjectAreaChangeMonitoringEnabled turned on) and force a refocus, if you're changing the focus mode to AVCaptureFocusModeAutoFocus at any point. A sketch of that registration (subjectAreaDidChange: is an illustrative selector name):
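if ([device lockForConfiguration:&error]) {
    device.subjectAreaChangeMonitoringEnabled = YES; // required for the notification to fire
    [device unlockForConfiguration];
}
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(subjectAreaDidChange:)
                                             name:AVCaptureDeviceSubjectAreaDidChangeNotification
                                           object:device];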
You might also want to add code that manually triggers autofocus and then resets it to automatic, as this is sometimes required. Something like this, as a sketch (forceRefocus/restoreContinuousFocus are illustrative names; call the second once the one-shot scan has finished, e.g. when the device's adjustingFocus flips back to NO):
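- (void)forceRefocus {
    NSError *error = nil;
    if ([device isFocusModeSupported:AVCaptureFocusModeAutoFocus] &&
        [device lockForConfiguration:&error]) {
        [device setFocusMode:AVCaptureFocusModeAutoFocus]; // runs a single focus scan
        [device unlockForConfiguration];
    }
}

- (void)restoreContinuousFocus {
    NSError *error = nil;
    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] &&
        [device lockForConfiguration:&error]) {
        [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus]; // back to automatic
        [device unlockForConfiguration];
    }
}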
I've amended the code to set a focus point of interest and to report camera-configuration errors to a delegate method.
- (void)setupCaptureSession {
    NSError *error = nil;
    [self setCaptureSession:[[AVCaptureSession alloc] init]];
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;

    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Point of interest first, then the focus mode -- the mode "commits" the point.
    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] &&
        [device lockForConfiguration:&error]) {
        if ([device isFocusPointOfInterestSupported])
            [device setFocusPointOfInterest:CGPointMake(0.5f, 0.5f)];
        [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        [device unlockForConfiguration];
    } else {
        if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)]) {
            [[self delegate] captureManager:self didFailWithError:error];
        }
    }

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) {
        // TODO: Handle the error when the input cannot be created
    }
    [[self captureSession] addInput:input];

    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [[self captureSession] addOutput:output];

    // Deliver sample buffers on a dedicated serial queue.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    output.videoSettings = [NSDictionary dictionaryWithObject:
                                [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                       forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    output.minFrameDuration = CMTimeMake(1, 15); // cap at 15 fps (deprecated as of iOS 5)

    [[self captureSession] startRunning];

    // previewLayer is a UIView here, despite the name.
    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    captureVideoPreviewLayer.frame = previewLayer.bounds;
    [previewLayer.layer insertSublayer:captureVideoPreviewLayer atIndex:0];
    [previewLayer setHidden:NO];

    mutex = YES;
}
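For completeness, a sketch of the matching sample-buffer callback (this is the standard AVCaptureVideoDataOutputSampleBufferDelegate method from QA1702; it runs on "myQueue", and the processing itself is up to you):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address before reading pixel data, unlock when done.
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    // ... read the 32BGRA pixels here ...
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
}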