A while ago I asked this question and got a good answer:
I've been searching this forum up and down but I couldn't find what I really …
Wow, that blog post was something special. A whole lot of words to just state that they get the sample buffer bytes that Apple hands you back from a still image. There's nothing particularly innovative about their approach, and I know a number of camera applications that do this.
You can get at the raw bytes returned from a photo taken with an AVCaptureStillImageOutput using code like the following:
[photoOutput captureStillImageAsynchronouslyFromConnection:[[photoOutput connections] objectAtIndex:0] completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    // Grab the pixel buffer backing the still image and lock it before reading.
    CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(imageSampleBuffer);
    CVPixelBufferLockBaseAddress(cameraFrame, 0);
    GLubyte *rawImageBytes = CVPixelBufferGetBaseAddress(cameraFrame);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(cameraFrame);
    NSData *dataForRawBytes = [NSData dataWithBytes:rawImageBytes length:bytesPerRow * CVPixelBufferGetHeight(cameraFrame)];
    // Do whatever with your bytes
    CVPixelBufferUnlockBaseAddress(cameraFrame, 0);
}];
This will give you an NSData instance containing the uncompressed BGRA bytes returned from the camera. You can save these to disk or do whatever you want with them. If you really need to process the bytes themselves, I'd avoid the overhead of the NSData creation and just work with the byte array from the pixel buffer.
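For example, here's a rough sketch of processing the pixels in place (my own illustration, not from the original answer): it assumes the 32-bit BGRA format and would sit where the "Do whatever with your bytes" comment is, between the lock and unlock calls, reusing rawImageBytes and bytesPerRow from above:

size_t width  = CVPixelBufferGetWidth(cameraFrame);
size_t height = CVPixelBufferGetHeight(cameraFrame);
uint64_t greenSum = 0;
for (size_t y = 0; y < height; y++) {
    // bytesPerRow can include padding, so step rows by bytesPerRow, not width * 4.
    GLubyte *row = rawImageBytes + y * bytesPerRow;
    for (size_t x = 0; x < width; x++) {
        greenSum += row[x * 4 + 1]; // BGRA layout: B = 0, G = 1, R = 2, A = 3
    }
}
double meanGreen = (double)greenSum / (double)(width * height);
NSLog(@"Mean green value: %f", meanGreen);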
I was able to solve it with OpenCV. Thanks to everyone who helped me.
// sampleBuffer comes from the capture callback; the output is assumed
// to be configured for 32-bit BGRA (kCVPixelFormatType_32BGRA).
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
// Wrap the pixels in a cv::Mat without copying; pass the row stride explicitly.
cv::Mat frame((int)height, (int)width, CV_8UC4, baseAddress, CVPixelBufferGetBytesPerRow(imageBuffer));
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *filePath = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"ocv%d.BMP", picNum]]; // picNum: your own frame counter
cv::imwrite([filePath UTF8String], frame);
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
I just have to use the imwrite function from OpenCV. This way I get BMP files of around 24 MB directly after the Bayer filter!
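If the 24 MB files are a problem, one small variation (same frame and documentsDirectory as above) is to let imwrite compress for you; it picks the encoder from the file extension:

// Same call, different extension: a PNG is losslessly compressed
// and far smaller than a raw BMP.
NSString *pngPath = [documentsDirectory stringByAppendingPathComponent:
                     [NSString stringWithFormat:@"ocv%d.png", picNum]];
cv::imwrite([pngPath UTF8String], frame);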
As @Wildaker mentioned, for any of this code to work you have to be sure which pixel format the camera is sending you. The code from @thomketler will work if the output is set to the 32-bit BGRA format (kCVPixelFormatType_32BGRA).
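For instance, here is a sketch of pinning the format on a video data output (the same dictionary style applies to the outputSettings of AVCaptureStillImageOutput), plus a runtime check of what actually arrived:

// Ask the capture pipeline for 32-bit BGRA so the cv::Mat wrapping above is valid.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
};

// Defensive check inside the callback: confirm the format before touching bytes.
OSType format = CVPixelBufferGetPixelFormatType(cameraFrame);
NSAssert(format == kCVPixelFormatType_32BGRA, @"Unexpected pixel format");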
Here is code for the camera's default YUV format, using OpenCV:
cv::Mat convertImage(CMSampleBufferRef sampleBuffer)
{
    CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(cameraFrame, 0);

    int w = (int)CVPixelBufferGetWidth(cameraFrame);
    int h = (int)CVPixelBufferGetHeight(cameraFrame);
    void *baseAddress = CVPixelBufferGetBaseAddressOfPlane(cameraFrame, 0);

    // The bi-planar buffer holds the full-size Y plane followed by the
    // half-size interleaved CbCr plane, hence h + h/2 rows of single bytes.
    // (This assumes the two planes are contiguous in memory.)
    cv::Mat img_buffer(h + h/2, w, CV_8UC1, (uchar *)baseAddress);
    cv::Mat cam_frame;
    // kCVPixelFormatType_420YpCbCr8BiPlanar* buffers use the NV12 chroma
    // order; use COLOR_YUV2BGR_NV21 instead if your source swaps Cb and Cr.
    cv::cvtColor(img_buffer, cam_frame, cv::COLOR_YUV2BGR_NV12);
    cam_frame = cam_frame.t(); // transpose to fix the sensor orientation

    // End processing
    CVPixelBufferUnlockBaseAddress(cameraFrame, 0);
    return cam_frame;
}
cam_frame should now hold the full BGR frame. I hope that helps.
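For completeness, here's a sketch of where convertImage would be called; the delegate wiring is my own assumption, not part of the original answer:

// Hypothetical AVCaptureVideoDataOutputSampleBufferDelegate method.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    cv::Mat bgrFrame = convertImage(sampleBuffer);
    // bgrFrame now owns its pixels (cvtColor copied them), so it is safe
    // to hand off for further processing after the sample buffer is gone.
}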
While the core of the answer comes from Brad at "iOS: Get pixel-by-pixel data from camera", a key element is completely unclear from Brad's reply. It's hidden in "once you have your capture session configured...".
You need to set the correct outputSettings for your AVCaptureStillImageOutput. For example, setting kCVPixelBufferPixelFormatTypeKey to kCVPixelFormatType_420YpCbCr8BiPlanarFullRange will give you a YCbCr imageDataSampleBuffer in captureStillImageAsynchronouslyFromConnection:completionHandler:, which you can then manipulate to your heart's content.
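Concretely, that configuration might look like the following sketch (session stands in for your already-configured AVCaptureSession):

AVCaptureStillImageOutput *photoOutput = [[AVCaptureStillImageOutput alloc] init];
// Request uncompressed YCbCr (NV12-style bi-planar) sample buffers.
photoOutput.outputSettings = @{
    (id)kCVPixelBufferPixelFormatTypeKey :
        @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
};
if ([session canAddOutput:photoOutput]) {
    [session addOutput:photoOutput];
}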