Is there a way to take a picture in code on the iPhone without going through the Apple controls? I have seen a bunch of apps that do this, but I'm not sure what API call to use.
EDIT: As suggested in the comments below, I have now explicitly shown how the AVCaptureSession needs to be declared and initialized. It seems that a few people were doing the initialization wrong or declaring AVCaptureSession as a local variable in a method, which will not work.
The following code allows you to take a picture using AVCaptureSession without user input:
// Get all cameras in the application and find the frontal camera.
AVCaptureDevice *frontalCamera = nil;
NSArray *allCameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

// Find the frontal camera.
for (int i = 0; i < allCameras.count; i++) {
    AVCaptureDevice *camera = [allCameras objectAtIndex:i];
    if (camera.position == AVCaptureDevicePositionFront) {
        frontalCamera = camera;
    }
}

// If we did not find the camera then do not take a picture.
if (frontalCamera != nil) {
    // Start the process of getting a picture.
    session = [[AVCaptureSession alloc] init];

    // Set up an input with the frontal camera and add it to the session.
    NSError *error;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:frontalCamera error:&error];
    if (!error && [session canAddInput:input]) {
        // Add the frontal camera to this session.
        [session addInput:input];

        // We need to capture a still image.
        AVCaptureStillImageOutput *output = [[AVCaptureStillImageOutput alloc] init];

        // Captured image settings.
        [output setOutputSettings:
            [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil]];

        if ([session canAddOutput:output]) {
            [session addOutput:output];

            // Find the video connection on the still image output.
            AVCaptureConnection *videoConnection = nil;
            for (AVCaptureConnection *connection in output.connections) {
                for (AVCaptureInputPort *port in [connection inputPorts]) {
                    if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                        videoConnection = connection;
                        break;
                    }
                }
                if (videoConnection) { break; }
            }

            // Finally take the picture.
            if (videoConnection) {
                [session startRunning];
                [output captureStillImageAsynchronouslyFromConnection:videoConnection
                                                    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                    if (imageDataSampleBuffer != NULL) {
                        NSData *imageData = [AVCaptureStillImageOutput
                            jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                        UIImage *photo = [[UIImage alloc] initWithData:imageData];
                    }
                }];
            }
        }
    }
}
The session variable is of type AVCaptureSession and is declared in the .h file of the class (either as a property or as a private member of the class):
AVCaptureSession *session;
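For example, the declaration in the .h file could look like the following (CameraViewController is just a placeholder class name for illustration):

#import <AVFoundation/AVFoundation.h>

@interface CameraViewController : UIViewController {
    AVCaptureSession *session;   // private member variable holding the capture session
}

// ...or, alternatively, expose it as a property instead of the member variable:
// @property (nonatomic, strong) AVCaptureSession *session;

@end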
It will then need to be initialized somewhere, for instance in the class's init method:
session = [[AVCaptureSession alloc] init];
Yes, there are two ways to do this. The first, available in iOS 3.0 and later, is to use the UIImagePickerController class, setting the showsCameraControls property to NO and setting the cameraOverlayView property to your own custom controls. The second, available in iOS 4.0 and later, is to configure an AVCaptureSession, providing it with an AVCaptureDeviceInput using the appropriate camera device and an AVCaptureStillImageOutput. The first approach is much simpler and works on more iOS versions, but the second approach gives you much greater control over photo resolution and file options.
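For the first approach, a minimal sketch might look like the following. The UIImagePickerController API calls (showsCameraControls, cameraOverlayView, takePicture, and the delegate callback) are the real ones; the class name, overlay layout, and helper method are just assumptions for illustration:

#import <UIKit/UIKit.h>

@interface HiddenControlsCameraController : UIViewController <UIImagePickerControllerDelegate, UINavigationControllerDelegate>
@property (nonatomic, strong) UIImagePickerController *picker;
@end

@implementation HiddenControlsCameraController

- (void)presentCustomCamera {
    // Only proceed if the device actually has a camera.
    if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        return;
    }

    self.picker = [[UIImagePickerController alloc] init];
    self.picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    self.picker.delegate = self;

    // Hide Apple's default camera UI and supply our own overlay.
    self.picker.showsCameraControls = NO;
    self.picker.cameraOverlayView = [self buildOverlayView];

    [self presentViewController:self.picker animated:YES completion:nil];
}

// Example overlay: a single button that triggers the shutter.
- (UIView *)buildOverlayView {
    UIView *overlay = [[UIView alloc] initWithFrame:[UIScreen mainScreen].bounds];
    UIButton *shutter = [UIButton buttonWithType:UIButtonTypeSystem];
    shutter.frame = CGRectMake(120.0, 400.0, 80.0, 44.0);
    [shutter setTitle:@"Snap" forState:UIControlStateNormal];
    [shutter addTarget:self action:@selector(takePhoto) forControlEvents:UIControlEventTouchUpInside];
    [overlay addSubview:shutter];
    return overlay;
}

- (void)takePhoto {
    // Programmatically trigger the capture; this is allowed when showsCameraControls is NO.
    [self.picker takePicture];
}

// Delegate callback delivering the captured image.
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *photo = [info objectForKey:UIImagePickerControllerOriginalImage];
    // Use the photo here (save it, display it, etc.), then dismiss the picker.
    [self dismissViewControllerAnimated:YES completion:nil];
}

@end

The overall flow is: hide the stock controls, draw your own UI in cameraOverlayView, and call takePicture yourself when your control is tapped; the result arrives through the normal delegate callback.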