I am trying to develop an iPhone app for face recognition/detection. In my app, I want the iPhone camera to auto focus and auto capture.
How can I recognize a face from an iPhone app?
Is it possible to auto focus on the face and auto capture in an iPhone app? If it is possible, can anyone please help me do this? I just want suggestions/ideas and tutorials about that.
Can you please help me? Thanks in advance.
Core Image has a CIDetector (created with CIDetectorTypeFace) that can detect faces in real time; you can start with these examples to get an overview:
SquareCam
iOS Facial Recognition
Easy Face detection with Core Image
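For the auto-focus and real-time part of the question, here is a minimal sketch of my own (not taken from those samples; the FaceCaptureController class name and the queue label are just placeholders) showing how an AVCaptureSession could be configured for continuous autofocus while its video frames are fed to a CIDetector. Treat it as a starting point rather than a finished implementation:

// Sketch: continuous autofocus + per-frame face detection with CIDetector.
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

@interface FaceCaptureController : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) CIDetector *faceDetector;
@end

@implementation FaceCaptureController

- (void)startCapture
{
    self.session = [[AVCaptureSession alloc] init];

    // Pick the default camera and enable continuous autofocus if supported.
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([camera isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] &&
        [camera lockForConfiguration:nil]) {
        camera.focusMode = AVCaptureFocusModeContinuousAutoFocus;
        [camera unlockForConfiguration];
    }

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input) [self.session addInput:input];

    // Deliver frames to a delegate so each one can be passed to the detector.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [output setSampleBufferDelegate:self queue:dispatch_queue_create("face.detect", NULL)];
    [self.session addOutput:output];

    // Low accuracy is usually the better trade-off for real-time work.
    self.faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace
                                           context:nil
                                           options:@{CIDetectorAccuracy : CIDetectorAccuracyLow}];
    [self.session startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Wrap the camera frame in a CIImage and look for faces in it.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    NSArray *faces = [self.faceDetector featuresInImage:frame];
    if (faces.count > 0) {
        // A face is in view -- this is where an "auto capture" could be
        // triggered (e.g. by firing a still-image capture, not shown here).
        NSLog(@"Detected %lu face(s)", (unsigned long)faces.count);
    }
}

@end

From there, "auto capture" could be implemented by firing a still-image capture once a face has been detected for a few consecutive frames, rather than on every single frame.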
Check this code. You have to import the following: #import <CoreImage/CoreImage.h> (and link the CoreImage framework), and after that use this code:
-(void)markFaces:(UIImageView *)facePicture
{
    // draw a CI image with the previously loaded face detection picture
    CIImage *image = [CIImage imageWithCGImage:facePicture.image.CGImage];

    // create a face detector - since speed is not an issue here we'll use
    // the high-accuracy detector
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:@{CIDetectorAccuracy : CIDetectorAccuracyHigh}];

    // create an array containing all the detected faces from the detector
    NSArray *features = [detector featuresInImage:image];

    // iterate through every detected face. CIFaceFeature provides the bounds
    // of the entire face and the coordinates of each eye and the mouth if
    // detected, plus BOOLs so we can check whether they exist.
    for (CIFaceFeature *faceFeature in features)
    {
        // get the width of the face
        CGFloat faceWidth = faceFeature.bounds.size.width;

        // create a UIView using the bounds of the face
        UIView *faceView = [[UIView alloc] initWithFrame:faceFeature.bounds];

        // add a border around the newly created UIView
        faceView.layer.borderWidth = 1;
        faceView.layer.borderColor = [[UIColor redColor] CGColor];

        // add the new view to create a box around the face
        [self.window addSubview:faceView];

        if (faceFeature.hasLeftEyePosition)
        {
            // create a UIView with a size based on the width of the face
            UIView *leftEyeView = [[UIView alloc] initWithFrame:CGRectMake(faceFeature.leftEyePosition.x - faceWidth * 0.15,
                                                                           faceFeature.leftEyePosition.y - faceWidth * 0.15,
                                                                           faceWidth * 0.3,
                                                                           faceWidth * 0.3)];
            // change the background color of the eye view
            [leftEyeView setBackgroundColor:[[UIColor blueColor] colorWithAlphaComponent:0.3]];
            // set the position of the leftEyeView based on the face
            [leftEyeView setCenter:faceFeature.leftEyePosition];
            // round the corners
            leftEyeView.layer.cornerRadius = faceWidth * 0.15;
            // add the view to the window
            [self.window addSubview:leftEyeView];
        }

        if (faceFeature.hasRightEyePosition)
        {
            // create a UIView with a size based on the width of the face
            UIView *rightEyeView = [[UIView alloc] initWithFrame:CGRectMake(faceFeature.rightEyePosition.x - faceWidth * 0.15,
                                                                            faceFeature.rightEyePosition.y - faceWidth * 0.15,
                                                                            faceWidth * 0.3,
                                                                            faceWidth * 0.3)];
            // change the background color of the eye view
            [rightEyeView setBackgroundColor:[[UIColor blueColor] colorWithAlphaComponent:0.3]];
            // set the position of the rightEyeView based on the face
            [rightEyeView setCenter:faceFeature.rightEyePosition];
            // round the corners
            rightEyeView.layer.cornerRadius = faceWidth * 0.15;
            // add the new view to the window
            [self.window addSubview:rightEyeView];
        }

        if (faceFeature.hasMouthPosition)
        {
            // create a UIView with a size based on the width of the face
            UIView *mouth = [[UIView alloc] initWithFrame:CGRectMake(faceFeature.mouthPosition.x - faceWidth * 0.2,
                                                                     faceFeature.mouthPosition.y - faceWidth * 0.2,
                                                                     faceWidth * 0.4,
                                                                     faceWidth * 0.4)];
            // change the background color for the mouth to green
            [mouth setBackgroundColor:[[UIColor greenColor] colorWithAlphaComponent:0.3]];
            // set the position of the mouthView based on the face
            [mouth setCenter:faceFeature.mouthPosition];
            // round the corners
            mouth.layer.cornerRadius = faceWidth * 0.2;
            // add the new view to the window
            [self.window addSubview:mouth];
        }
    }
}
-(void)faceDetector
{
    // Load the picture for face detection
    UIImageView *image = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"facedetectionpic.jpg"]];

    // Draw the face detection image
    [self.window addSubview:image];

    // Run markFaces: on the main thread -- it creates and adds UIViews,
    // and UIKit work must not be done on a background thread
    [self markFaces:image];

    // flip image on y-axis to match coordinate system used by Core Image
    [image setTransform:CGAffineTransformMakeScale(1, -1)];

    // flip the entire window to make everything right side up
    [self.window setTransform:CGAffineTransformMakeScale(1, -1)];
}
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];

    // Override point for customization after application launch.
    self.viewController = [[ViewController alloc] initWithNibName:@"ViewController" bundle:nil];
    self.window.rootViewController = self.viewController;
    [self.window makeKeyAndVisible];

    [self faceDetector]; // execute the faceDetector code
    return YES;
}
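A side note on the flipping above: Core Image uses a bottom-left origin, which is why the example flips the image and the whole window. An alternative (a small sketch of my own; the helper name is made up) is to convert each detected rect into UIKit coordinates instead of transforming the view hierarchy:

#import <UIKit/UIKit.h>

// Hypothetical helper: convert a Core Image rect (bottom-left origin) into
// UIKit coordinates (top-left origin) for an image of the given height.
static CGRect FaceRectInUIKitCoordinates(CGRect ciRect, CGFloat imageHeight)
{
    CGAffineTransform flip = CGAffineTransformMakeScale(1, -1);
    flip = CGAffineTransformTranslate(flip, 0, -imageHeight);
    return CGRectApplyAffineTransform(ciRect, flip);
}

// Usage inside the loop over CIFaceFeature objects:
// CGRect box = FaceRectInUIKitCoordinates(faceFeature.bounds, facePicture.image.size.height);
// UIView *faceView = [[UIView alloc] initWithFrame:box];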
Hope it helps, thanks :)
Source: https://stackoverflow.com/questions/10496724/how-to-develop-a-face-recognition-iphone-app