Question
When I use the following code:
UIImage *image=[UIImage imageNamed:@"loginf1.png"];
CGImageRef rawImageRef=image.CGImage;
const float colorMasking[6] = {222, 255, 222, 255, 222, 255};
CGImageRef maskedImageRef=CGImageCreateWithMaskingColors(rawImageRef, colorMasking);
maskedImageRef is always nil. Why is this, and what can I do to correct this?
Answer 1:
I had the same problem. The CGImageRef you are creating has an alpha channel, and the masking function needs a CGImageRef without one — for example a format with 4 bytes per pixel where only 3 of them are used and there is no alpha channel. At least, I think this is what's causing it.
Anyway, fix it by creating a bitmap context without an alpha channel, drawing your image into that context, and then getting your CGImageRef from CGBitmapContextCreateImage.
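A sketch of that workaround, assuming a device RGB color space and the image name from the question (error handling omitted). Note that CGImageCreateWithMaskingColors takes a const CGFloat array, not const float. kCGImageAlphaNoneSkipLast gives 4 bytes per pixel with only 3 used and no alpha channel:

UIImage *image = [UIImage imageNamed:@"loginf1.png"];
CGImageRef rawImageRef = image.CGImage;
size_t width  = CGImageGetWidth(rawImageRef);
size_t height = CGImageGetHeight(rawImageRef);

// Bitmap context with no alpha channel: 8 bits per component,
// 4 bytes per row per pixel, the fourth byte is padding.
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL, width, height,
                                             8, width * 4, colorSpace,
                                             kCGImageAlphaNoneSkipLast);
CGContextDrawImage(context, CGRectMake(0, 0, width, height), rawImageRef);

// This image has no alpha channel, so the masking call should succeed.
CGImageRef noAlphaImageRef = CGBitmapContextCreateImage(context);
const CGFloat colorMasking[6] = {222, 255, 222, 255, 222, 255};
CGImageRef maskedImageRef = CGImageCreateWithMaskingColors(noAlphaImageRef, colorMasking);
UIImage *maskedImage = [UIImage imageWithCGImage:maskedImageRef];

CGImageRelease(maskedImageRef);
CGImageRelease(noAlphaImageRef);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);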
Answer 2:
The reason it is failing is that you CANNOT mask an image that has an alpha channel. Unfortunately, what you are trying to do is NOT possible that way. The only way to use "CGImageCreateWithMaskingColors(...)" is to provide it an image from a bitmap context WITHOUT an alpha channel. The catch-22 here is that most bitmap contexts you create by default DO have an alpha channel. Don't you just love Apple?
Source: https://stackoverflow.com/questions/3563770/why-is-cgimagecreatewithmaskingcolors-returning-nil-in-this-case