iOS — detect the color of a pixel?


Question


For example, suppose I want to detect the color of the pixel with screen coordinates (100, 200). Is there a way to do this?

EDIT -- I'm not worried about retina display issues for now.


Answer 1:


This may not be the most direct route, but you could:

  1. Use UIGraphicsBeginImageContextWithOptions to grab the screen (see the Apple Q&A QA1703 - "Screen Capture in UIKit Applications").

  2. Then use CGImageCreateWithImageInRect to grab the portion of the resultant image you require.

  3. Finally, analyse the resultant image. It gets complicated at this point, but thankfully there's an existing question that should show you the way: "How to get the RGB values for a pixel on an image on the iPhone".

Alternatively, there's the following blog article that has accompanying code: What Color is My Pixel? Image based color picker on iPhone
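For reference, a rough sketch of steps 1 and 2 might look like the following. The method name and the use of the key window are illustrative assumptions, and the scale of 1.0 matches the question's decision to ignore retina for now:

- (CGImageRef)newScreenPixelImageAtPoint:(CGPoint)point
{
    UIWindow *window = [UIApplication sharedApplication].keyWindow;

    // Step 1: render the window's layer into an image context (QA1703 style).
    // Scale 1.0 keeps point coordinates equal to pixel coordinates.
    UIGraphicsBeginImageContextWithOptions(window.bounds.size, NO, 1.0);
    [window.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Step 2: crop out the 1x1 region that contains the pixel of interest.
    CGRect pixelRect = CGRectMake(point.x, point.y, 1.0, 1.0);
    return CGImageCreateWithImageInRect(screenshot.CGImage, pixelRect); // caller must CGImageRelease()
}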




Answer 2:


Here is how to do it, given a UIImage (referred to as image below):

// "image" is the UIImage to inspect, "xCoor"/"yCoor" is the pixel you are
// interested in, and "newPixelValue" is whatever value you want to write
// back -- all of them come from the surrounding code.
CGContextRef ctx;
CGImageRef imageRef = [image CGImage];
NSUInteger width = CGImageGetWidth(imageRef);
NSUInteger height = CGImageGetHeight(imageRef);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
unsigned char *rawData = malloc(height * width * 4);
NSUInteger bytesPerPixel = 4;
NSUInteger bytesPerRow = bytesPerPixel * width;
NSUInteger bitsPerComponent = 8;

// Render the image into a bitmap context backed by rawData, so each pixel
// becomes 4 readable/writable bytes (R, G, B, A).
CGContextRef context = CGBitmapContextCreate(rawData, width, height,
                                             bitsPerComponent, bytesPerRow, colorSpace,
                                             kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
CGContextRelease(context);

// GET PIXEL FROM POINT
NSUInteger index = bytesPerPixel * ((width * (NSUInteger)round(yCoor)) + (NSUInteger)round(xCoor));

int R = rawData[index];
int G = rawData[index + 1];
int B = rawData[index + 2];

NSLog(@"%d   %d   %d", R, G, B);

// IF YOU WANT TO ALTER THE PIXELS
NSUInteger byteIndex = 0;

for (NSUInteger ii = 0; ii < width * height; ++ii)
{
    rawData[byteIndex]     = (unsigned char)newPixelValue;
    rawData[byteIndex + 1] = (unsigned char)newPixelValue;
    rawData[byteIndex + 2] = (unsigned char)newPixelValue;

    byteIndex += 4;
}

// Rebuild a CGImage from the modified buffer. Reuse the bytesPerRow and
// color space the buffer was laid out with (not the original image's) so the
// geometry matches rawData, and release the new CGImage once the UIImage has
// been created from it.
ctx = CGBitmapContextCreate(rawData,
                            width,
                            height,
                            bitsPerComponent,
                            bytesPerRow,
                            colorSpace,
                            kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);

CGImageRef newImageRef = CGBitmapContextCreateImage(ctx);
UIImage *rawImage = [UIImage imageWithCGImage:newImageRef];

CGImageRelease(newImageRef);
CGContextRelease(ctx);
CGColorSpaceRelease(colorSpace);

image = rawImage;

free(rawData);
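If you need this more than once, the read-only half above can be packaged as a small helper. The following is only a sketch of that, using the same buffer layout; the method name is illustrative and not part of the original answer:

// Sketch: same technique as above, wrapped as a helper that returns a UIColor.
- (UIColor *)colorInImage:(UIImage *)image atX:(NSUInteger)x y:(NSUInteger)y
{
    CGImageRef imageRef = [image CGImage];
    NSUInteger width = CGImageGetWidth(imageRef);
    NSUInteger height = CGImageGetHeight(imageRef);

    unsigned char *rawData = malloc(height * width * 4);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(rawData, width, height, 8, 4 * width,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    CGContextRelease(context);

    // Note: components are premultiplied by alpha, which is fine for opaque pixels.
    NSUInteger index = 4 * (width * y + x);
    UIColor *color = [UIColor colorWithRed:rawData[index] / 255.0
                                     green:rawData[index + 1] / 255.0
                                      blue:rawData[index + 2] / 255.0
                                     alpha:rawData[index + 3] / 255.0];
    free(rawData);
    return color;
}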



Answer 3:


Try this one, where self.m_imgvwSource is the UIView or UIImageView you want to sample:

- (UIColor *)GetCurrentPixelColorAtPoint:(CGPoint)point
{
    unsigned char pixel[4] = {0};

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // 1x1 RGBA bitmap context backed by the 4-byte pixel buffer above.
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace, kCGBitmapAlphaInfoMask & kCGImageAlphaPremultipliedLast);

    // Shift the view's contents so the requested point lands on the single
    // pixel of the context, then render the layer into it.
    CGContextTranslateCTM(context, -point.x, -point.y);

    [self.m_imgvwSource.layer renderInContext:context];

    CGContextRelease(context);

    CGColorSpaceRelease(colorSpace);

    NSLog(@"pixel: %d %d %d %d", pixel[0], pixel[1], pixel[2], pixel[3]);

    // Note: the components are premultiplied by alpha, which is fine for
    // fully opaque pixels.
    UIColor *color = [UIColor colorWithRed:pixel[0]/255.0 green:pixel[1]/255.0 blue:pixel[2]/255.0 alpha:pixel[3]/255.0];

    return color;
}
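As a hypothetical usage example, you could call it from a touch handler; the handler below is illustrative and not part of the original answer:

// Hypothetical usage: sample the color wherever the user lifts a finger.
- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    CGPoint point = [[touches anyObject] locationInView:self.m_imgvwSource];
    UIColor *tappedColor = [self GetCurrentPixelColorAtPoint:point];
    NSLog(@"tapped color: %@", tappedColor);
}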


Source: https://stackoverflow.com/questions/4616778/ios-detect-the-color-of-a-pixel
