Question
I am writing a method to calculate the average R,G,B values of an image. The following method takes a UIImage as input and returns an array containing the average R,G,B values of that image. I have one question though: how and where do I properly release the CGImageRef?
-(NSArray *)getAverageRGBValuesFromImage:(UIImage *)image
{
    CGImageRef rawImageRef = [image CGImage];

    //This function returns the raw pixel values
    const UInt8 *rawPixelData = CFDataGetBytePtr(CGDataProviderCopyData(CGImageGetDataProvider(rawImageRef)));

    NSUInteger imageHeight = CGImageGetHeight(rawImageRef);
    NSUInteger imageWidth  = CGImageGetWidth(rawImageRef);

    //Here I sort the R,G,B values and get the average over the whole image
    int i = 0;
    unsigned int red = 0;
    unsigned int green = 0;
    unsigned int blue = 0;

    for (int column = 0; column < imageWidth; column++)
    {
        int r_temp = 0;
        int g_temp = 0;
        int b_temp = 0;

        for (int row = 0; row < imageHeight; row++) {
            i = (row * imageWidth + column) * 4;
            r_temp += (unsigned int)rawPixelData[i];
            g_temp += (unsigned int)rawPixelData[i+1];
            b_temp += (unsigned int)rawPixelData[i+2];
        }

        red   += r_temp;
        green += g_temp;
        blue  += b_temp;
    }

    NSNumber *averageRed   = [NSNumber numberWithFloat:(1.0*red)/(imageHeight*imageWidth)];
    NSNumber *averageGreen = [NSNumber numberWithFloat:(1.0*green)/(imageHeight*imageWidth)];
    NSNumber *averageBlue  = [NSNumber numberWithFloat:(1.0*blue)/(imageHeight*imageWidth)];

    //Then I store the result in an array
    NSArray *result = [NSArray arrayWithObjects:averageRed, averageGreen, averageBlue, nil];

    return result;
}
I tried two things:
Option 1: I leave it as it is, but then after a few cycles (5+) the program crashes and I get a "low memory warning" error.
Option 2: I add one line, CGImageRelease(rawImageRef), before the method returns. Now it crashes after the second cycle, and I get an EXC_BAD_ACCESS error for the UIImage that I pass to the method. When I Analyze (instead of Run) in Xcode, I get the following warning at that line: "Incorrect decrement of the reference count of an object that is not owned at this point by the caller".
Where and how should I release the CGImageRef?
Thanks!
Answer 1:
Your memory issue results from the copied data, as others have stated. But here's another idea: Use Core Graphics's optimized pixel interpolation to calculate the average.
- Create a 1x1 bitmap context.
- Set the interpolation quality to medium (see later).
- Draw your image scaled down to exactly this one pixel.
- Read the RGB value from the context's buffer.
- (Release the context, of course.)
This might result in better performance because Core Graphics is highly optimized and might even use the GPU for the downscaling.
Testing showed that medium quality seems to interpolate pixels by taking the average of color values. That's what we want here.
Worth a try, at least.
Edit: OK, this idea seemed too interesting not to try. So here's an example project showing the difference. The measurements below were taken with the 512x512 test image the project contains, but you can change the image if you want.
It takes about 12.2 ms to calculate the average by iterating over all pixels in the image data. The draw-to-one-pixel approach takes 3 ms, so it's roughly 4 times faster. It seems to produce the same results when using kCGInterpolationQualityMedium.
I assume the huge performance gain results from Quartz noticing that it does not have to decompress the JPEG fully and can use only the lower-frequency parts of the DCT. That's an interesting optimization strategy when compositing JPEG-compressed pixels at a scale below 0.5. But I'm only guessing here.
Interestingly, when using your method, 70% of the time is spent in CGDataProviderCopyData and only 30% in the pixel data traversal. This hints at a lot of time being spent in JPEG decompression.
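If you want to try this without downloading the project, here is a minimal sketch of the draw-to-one-pixel approach, written against the original method's signature. The bitmap format (kCGImageAlphaPremultipliedLast, so the bytes come out as R,G,B,A) and the variable names are my own choices, not taken from the linked project:

- (NSArray *)getAverageRGBValuesFromImage:(UIImage *)image
{
    CGImageRef rawImageRef = [image CGImage];

    // A 1x1 RGBA buffer that the whole image gets drawn (and thereby averaged) into.
    unsigned char pixel[4] = {0, 0, 0, 0};

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace,
                                                 (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);

    CGContextSetInterpolationQuality(context, kCGInterpolationMedium);
    CGContextDrawImage(context, CGRectMake(0, 0, 1, 1), rawImageRef);
    CGContextRelease(context);

    // pixel[0..2] now hold the interpolated R, G, B values (alpha-premultiplied,
    // which makes no difference for fully opaque images).
    return [NSArray arrayWithObjects:
                [NSNumber numberWithFloat:pixel[0]],
                [NSNumber numberWithFloat:pixel[1]],
                [NSNumber numberWithFloat:pixel[2]],
                nil];
}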
Answer 2:
You don't own the CGImageRef rawImageRef, because you obtained it with [image CGImage], so you don't need to release it. However, you do own the data object returned by CGDataProviderCopyData (the bytes that rawPixelData points to), and you must release it. From the CGDataProviderCopyData documentation:
Return Value: A new data object containing a copy of the provider’s data. You are responsible for releasing this object.
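Applied to the code in the question, that means keeping a reference to the copied data and releasing it once you are done reading the bytes. A sketch (the name pixelData is just illustrative):

CFDataRef pixelData = CGDataProviderCopyData(CGImageGetDataProvider(rawImageRef));
const UInt8 *rawPixelData = CFDataGetBytePtr(pixelData);

// ... loop over rawPixelData and compute the averages as before ...

// Release the CFData you own; do not call CGImageRelease(rawImageRef).
CFRelease(pixelData);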
Answer 3:
I believe your issue is in this statement:
const UInt8 *rawPixelData = CFDataGetBytePtr(CGDataProviderCopyData(CGImageGetDataProvider(rawImageRef)));
You should be releasing the return value of CGDataProviderCopyData.
Answer 4:
Your mergedColor works great on an image loaded from a file, but not for an image captured by the camera, because CGBitmapContextGetData() on a context created from a captured sample buffer doesn't return its bitmap. I changed your code to the following. It works on any image, and it is as fast as your code.
- (UIColor *)mergedColor
{
    CGImageRef rawImageRef = [self CGImage];

    // Scale the image down to a one-pixel image; the interpolation averages the colors.
    uint8_t bitmapData[4];
    int width = 1;
    int height = 1;
    int bitmapBytesPerRow = width * 4;
    int bitmapByteCount = bitmapBytesPerRow * height;
    memset(bitmapData, 0, bitmapByteCount);

    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(bitmapData, width, height, 8, bitmapBytesPerRow,
                                                 colorspace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGColorSpaceRelease(colorspace);

    CGContextSetBlendMode(context, kCGBlendModeCopy);
    CGContextSetInterpolationQuality(context, kCGInterpolationMedium);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), rawImageRef);
    CGContextRelease(context);

    // With this byte order the pixel is stored as B,G,R,A in memory.
    return [UIColor colorWithRed:bitmapData[2] / 255.0f
                           green:bitmapData[1] / 255.0f
                            blue:bitmapData[0] / 255.0f
                           alpha:1];
}
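Since the method calls [self CGImage], it presumably lives in a UIImage category; a hypothetical call site (the category declaration and image name below are illustrative, not from the answer) could look like this:

// Assumed category declaration:
// @interface UIImage (MergedColor)
// - (UIColor *)mergedColor;
// @end

UIImage *photo = [UIImage imageNamed:@"photo"]; // any image, file-based or camera-captured
UIColor *averageColor = [photo mergedColor];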
Answer 5:
CFDataRef abgrData = CGDataProviderCopyData(CGImageGetDataProvider(rawImageRef));
const UInt8 *rawPixelData = CFDataGetBytePtr(abgrData);
...
CFRelease(abgrData);
Source: https://stackoverflow.com/questions/12147779/how-do-i-release-a-cgimageref-in-ios