iPhone: convert a color image to a 2-bit (black and white) image

半阙折子戏 2021-01-01 08:50

I need help to convert a color image to "black and white", not grayscale.

I want to do this with the iPhone SDK and Core Graphics, as I'm convinced this is possible.

3 Answers
  • 2021-01-01 08:53

    I thought I would just throw it out there: if you do not want to threshold the image, the per-pixel code would look like this:

    intensity = (pixelBuffer[index] + pixelBuffer[index + 1] + pixelBuffer[index + 2]) / 3.0;
    
    pixelBuffer[index] = (int)intensity;
    pixelBuffer[index + 1] = (int)intensity;  
    pixelBuffer[index + 2] = (int)intensity;
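
    For context, here is a minimal sketch of that idea wrapped in the same kind of loop the thresholding answer below uses; it assumes an RGBA `pixelBuffer` and a `length` of width * height * 4, as in that answer:

      // Grayscale (no threshold): replace each pixel's RGB with the channel average.
      // Assumes pixelBuffer holds 8-bit RGBA data and length = width * height * 4.
      for (size_t index = 0; index < length; index += 4) {
          CGFloat intensity = (pixelBuffer[index] + pixelBuffer[index + 1] + pixelBuffer[index + 2]) / 3.0;
          pixelBuffer[index]     = (unsigned char)intensity;
          pixelBuffer[index + 1] = (unsigned char)intensity;
          pixelBuffer[index + 2] = (unsigned char)intensity;
          // pixelBuffer[index + 3] (alpha) is left untouched.
      }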
    
  • 2021-01-01 08:57

    You can use the CGBitmapContextCreate method to get the data from an image.

    Have a look at this link: https://stackoverflow.com/a/448758/1863223
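
    As a rough sketch of that approach (with `someImage` as a hypothetical source image; the linked answer and the answer below cover the details):

      // Draw a CGImage into a bitmap context backed by a buffer we own,
      // so the raw RGBA bytes can be read and modified directly.
      CGImageRef image = someImage.CGImage;          // hypothetical source image
      size_t width  = CGImageGetWidth(image);
      size_t height = CGImageGetHeight(image);
      unsigned char *data = calloc(width * height * 4, 1);
      CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
      CGContextRef context = CGBitmapContextCreate(data, width, height, 8, width * 4,
                                                   colorSpace, kCGImageAlphaPremultipliedLast);
      CGColorSpaceRelease(colorSpace);
      CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);
      CGContextRelease(context);
      // ... read/modify data ..., then free(data) when done.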

  • 2021-01-01 09:18

    This is possible and I previously tried two paths:

    1. Convert to grayscale, then apply a pixel-by-pixel conversion to B&W.
       The problem with this approach is that I don't get good results for images with transparency.

    2. If you're not very strict: given an RGBA image, take each pixel's RGB average, convert it to black or white against a supplied threshold, and retain its transparency. Technically the result is still RGBA, but it contains only black, white, and transparency.

    e.g.

      UIImage *originalImage = [UIImage imageNamed:@"option_bluetooth.png"];

      // Use the CGImage's pixel dimensions (UIImage.size is in points and can
      // differ from the pixel size for Retina images).
      size_t width = CGImageGetWidth(originalImage.CGImage);
      size_t height = CGImageGetHeight(originalImage.CGImage);

      unsigned char *pixelBuffer = [self getPixelData:originalImage.CGImage];
      size_t length = width * height * 4;
      CGFloat intensity;
      int bw;
      // 50% threshold
      const CGFloat THRESHOLD = 0.5;
      for (size_t index = 0; index < length; index += 4)
      {
        // Average the RGB channels and normalise to 0..1.
        intensity = (pixelBuffer[index] + pixelBuffer[index + 1] + pixelBuffer[index + 2]) / 3. / 255.;
        if (intensity > THRESHOLD) {
          bw = 255;
        } else {
          bw = 0;
        }
        pixelBuffer[index] = bw;
        pixelBuffer[index + 1] = bw;
        pixelBuffer[index + 2] = bw;
        // Alpha (index + 3) is left untouched, so transparency is preserved.
      }

      CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
      CGContextRef bitmapContext = CGBitmapContextCreate(pixelBuffer, width, height, 8, 4 * width, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrderDefault);
      CGColorSpaceRelease(colorSpace);
      // Create the image before releasing the context and freeing the buffer
      // that backs it.
      CGImageRef cgImage = CGBitmapContextCreateImage(bitmapContext);
      CGContextRelease(bitmapContext);
      free(pixelBuffer);

      UIImage *bwImage = [UIImage imageWithCGImage:cgImage];
      CGImageRelease(cgImage);


    I get the pixel data by drawing the image into an offscreen bitmap context (the newer way of getting the raw data that Apple suggests did not work for me). Here's the code to get the pixel data:

    + (unsigned char *) getPixelData: (CGImageRef) cgCropped {

      size_t imageWidth = CGImageGetWidth(cgCropped);
      size_t imageHeight = CGImageGetHeight(cgCropped);
      size_t bitsPerComponent = 8;
      size_t bytesPerPixel = 4;
      size_t bytesPerRow = bytesPerPixel * imageWidth;
      CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

      // Draw the image into a bitmap context backed by our own buffer so the
      // raw RGBA bytes can be read and modified. The caller must free() the
      // returned buffer.
      unsigned char *rawData = malloc(imageHeight * imageWidth * 4);
      CGContextRef offscreenContext = CGBitmapContextCreate(rawData, imageWidth, imageHeight,
                                                            bitsPerComponent, bytesPerRow, colorSpace,
                                                            kCGImageAlphaPremultipliedLast | kCGBitmapByteOrderDefault);

      CGColorSpaceRelease(colorSpace);

      CGContextDrawImage(offscreenContext, CGRectMake(0, 0, imageWidth, imageHeight), cgCropped);
      CGContextRelease(offscreenContext);

      return rawData;
    }
    