Question
I have been successfully working with the Haar detection algorithm in OpenCV 2.1.0 (cvHaarDetectObjects) to detect faces in pictures and video frames from within an Objective-C project for iOS 4.2. However, processing a video frame still takes about 1-2 seconds on the iPhone 4 under most conditions. An example of the code I am using is given below:
// Load the bundled frontal-face Haar cascade.
NSString *path = [[NSBundle mainBundle] pathForResource:@"haarcascade_frontalface_alt"
                                                 ofType:@"xml"];
CvHaarClassifierCascade *cascade =
    (CvHaarClassifierCascade *)cvLoad([path cStringUsingEncoding:NSASCIIStringEncoding],
                                      NULL, NULL, NULL);
CvMemStorage *storage = cvCreateMemStorage(0);

// Rough search for the biggest face only, down to 30x30 pixels.
CvSeq *faces = cvHaarDetectObjects(small_image, cascade, storage, 1.2, 0,
                                   CV_HAAR_DO_ROUGH_SEARCH | CV_HAAR_FIND_BIGGEST_OBJECT,
                                   cvSize(30, 30));
I have tried several optimization techniques, including careful use of a region of interest (ROI) and replacing floats with integers, but these changes took a great deal of time to implement and gave only a minor benefit. A sketch of the ROI approach is shown below.
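For reference, this is roughly what I mean by the ROI approach: once a face has been found, restrict the next detection pass to the area around it. The prev_face rectangle and the padding values are just placeholders from my own code, and the rectangle is clamped to the image bounds before this point.

// Restrict the search to the neighbourhood of the previously detected face.
// prev_face is the CvRect from the last successful detection (clamped to the image).
CvRect roi = cvRect(prev_face.x - 20, prev_face.y - 20,
                    prev_face.width + 40, prev_face.height + 40);
cvSetImageROI(small_image, roi);

CvSeq *faces = cvHaarDetectObjects(small_image, cascade, storage, 1.2, 0,
                                   CV_HAAR_DO_ROUGH_SEARCH | CV_HAAR_FIND_BIGGEST_OBJECT,
                                   cvSize(30, 30));

// Restore the full frame for the next capture.
cvResetImageROI(small_image);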
It has been suggested to me that using LBP (local binary patterns) could significantly reduce the face detection time. I have been experimenting with and searching for ways to implement LBP, but to no avail. OpenCV ships a cascade file for it (lbpcascade_frontalface.xml), yet I cannot find any guidance on how to use it.
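The closest I have come is the C++ cv::CascadeClassifier interface, which as far as I can tell is needed because the LBP file is in the newer cascade format that cvLoad/cvHaarDetectObjects cannot read. Below is a rough, untested sketch of what I have been trying from an Objective-C++ (.mm) file; the function names, the grayscale cv::Mat input, and the header paths are my own assumptions rather than anything I found documented for 2.1.0.

// Rough sketch, untested on OpenCV 2.1.0. Header layout differs between releases;
// on newer 2.x builds this would be <opencv2/objdetect/objdetect.hpp>.
#include <opencv/cv.h>
#include <vector>

static cv::CascadeClassifier lbpCascade;

// Load the LBP cascade once, e.g. during view controller setup.
bool loadLBPCascade()
{
    NSString *path = [[NSBundle mainBundle] pathForResource:@"lbpcascade_frontalface"
                                                     ofType:@"xml"];
    return lbpCascade.load([path cStringUsingEncoding:NSASCIIStringEncoding]);
}

// Run detection on a grayscale cv::Mat built from the camera frame.
std::vector<cv::Rect> detectFaces(const cv::Mat &gray)
{
    std::vector<cv::Rect> faces;
    lbpCascade.detectMultiScale(gray, faces,
                                1.2,              // scale factor
                                2,                // minimum neighbours
                                0,                // flags
                                cv::Size(30, 30)); // minimum face size
    return faces;
}

Is this the right direction, and does it work on 2.1.0, or is a newer OpenCV build required for the LBP cascade?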
Any help would be appreciated, including other optimization techniques and links I may have missed in my searching. Detection accuracy is not critical as long as it is reasonably effective.
Thanks!
Answer 1:
Try using Instruments to determine where the performance bottlenecks are in your application. Chances are they are different from what you think they might be.
Also, check out this performance guide.
Source: https://stackoverflow.com/questions/4921260/face-detection-on-iphone-using-opencv-and-lbp