Why do I get the wrong color of a pixel with the following code?


I create a UIImage with a red background color:

    let theimage: UIImage = imageWithColor(UIColor(red: 1, green: 0, blue: 0, alpha: 1))

When I then read a pixel back with my getPixelColor routine, I get the wrong color.

1 Answer
  • 2020-12-17 04:19

    The problem is not the built-in getRed function, but rather the function that builds the UIColor object from the individual color components in the provider data. Your code is assuming that the provider data is stored in RGBA format, but it apparently is not. It would appear to be in ARGB format. Also, I'm not sure you have the byte order right, either.
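
    To make the mismatch concrete: an opaque red pixel in an alpha-first, 32-bit little-endian image sits in memory as B, G, R, A. A routine that assumes RGBA byte order reads those same four bytes as blue. A worked example:

    // the same four bytes, read two ways
    let bytes: [UInt8] = [0x00, 0x00, 0xFF, 0xFF]  // opaque red, BGRA in memory
    // assuming RGBA order: (r: 0, g: 0, b: 255, a: 255) -> blue (wrong)
    // actual BGRA order:   (r: 255, g: 0, b: 0, a: 255) -> red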

    When you have an image, there are a variety of ways of packing its color components into the provider data. A few of the possible formats are illustrated in the Quartz 2D Programming Guide.
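
    As a quick first diagnostic, you can ask the CGImage itself how its pixels are laid out. A minimal sketch, using theimage from the question:

    if let cgImage = theimage.cgImage {
        // e.g. prints "premultipliedFirst order32Little" for an image built the
        // way shown below, but it can differ for PNGs, JPEGs, screenshots, etc.
        print(cgImage.alphaInfo, cgImage.byteOrderInfo)
        print(cgImage.bitsPerPixel, cgImage.bytesPerRow)
    }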

    If you're going to have a getPixelColor routine that is hard-coded for a particular format, I might at least check the alphaInfo and byteOrderInfo like so (in Swift 4.2):

    extension UIImage {
        func getPixelColor(point: CGPoint) -> UIColor? {
            guard let cgImage = cgImage,
                let pixelData = cgImage.dataProvider?.data
                else { return nil }

            let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)

            // this routine assumes alpha-first, 32-bit little-endian pixels
            // (i.e. BGRA byte order in memory); fail loudly otherwise
            let alphaInfo = cgImage.alphaInfo
            assert(alphaInfo == .premultipliedFirst || alphaInfo == .first || alphaInfo == .noneSkipFirst, "This routine expects alpha to be first component")

            let byteOrderInfo = cgImage.byteOrderInfo
            assert(byteOrderInfo == .order32Little || byteOrderInfo == .orderDefault, "This routine expects little-endian 32bit format")

            let bytesPerRow = cgImage.bytesPerRow
            let pixelInfo = Int(point.y) * bytesPerRow + Int(point.x) * 4

            // in memory the bytes run B, G, R, A
            let a: CGFloat = CGFloat(data[pixelInfo+3]) / 255
            let r: CGFloat = CGFloat(data[pixelInfo+2]) / 255
            let g: CGFloat = CGFloat(data[pixelInfo+1]) / 255
            let b: CGFloat = CGFloat(data[pixelInfo  ]) / 255

            return UIColor(red: r, green: g, blue: b, alpha: a)
        }
    }
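
    With the red image from the question, a quick sanity check might look like this (the expected values assume the image really is alpha-first and little-endian):

    if let color = theimage.getPixelColor(point: CGPoint(x: 10, y: 10)) {
        var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
        if color.getRed(&r, green: &g, blue: &b, alpha: &a) {
            print(r, g, b, a)  // expect 1.0 0.0 0.0 1.0 for an opaque red pixel
        }
    }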
    

    And if you are always building this image programmatically, and the code that reads it back depends on a particular bitmap layout, I'd explicitly specify these details when creating the image:

    func image(with color: UIColor, size: CGSize) -> UIImage? {
        let rect = CGRect(origin: .zero, size: size)
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        // explicitly request premultiplied-alpha-first, 32-bit little-endian
        // pixels (BGRA in memory), matching what getPixelColor expects
        guard let context = CGContext(data: nil,
                                      width: Int(rect.width),
                                      height: Int(rect.height),
                                      bitsPerComponent: 8,
                                      bytesPerRow: Int(rect.width) * 4,
                                      space: colorSpace,
                                      bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue) else {
            return nil
        }
        context.setFillColor(color.cgColor)
        context.fill(rect)
        return context.makeImage().flatMap { UIImage(cgImage: $0) }
    }
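
    Putting the two together (the 100×100 size here is arbitrary):

    if let redImage = image(with: .red, size: CGSize(width: 100, height: 100)),
        let color = redImage.getPixelColor(point: CGPoint(x: 50, y: 50)) {
        print(color)  // something like "UIExtendedSRGBColorSpace 1 0 0 1"
    }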
    

    Perhaps even better, as shown in Technical Q&A QA1509, you might want to have getPixelColor explicitly create its own context of a predetermined format and draw the image into that context. That way the code is not contingent upon the format of the original image to which you apply it:

    extension UIImage {
    
        func getPixelColor(point: CGPoint) -> UIColor? {
            guard let cgImage = cgImage else { return nil }
    
            let width = Int(size.width)
            let height = Int(size.height)
            let colorSpace = CGColorSpaceCreateDeviceRGB()
    
            guard let context = CGContext(data: nil,
                                          width: width,
                                          height: height,
                                          bitsPerComponent: 8,
                                          bytesPerRow: width * 4,
                                          space: colorSpace,
                                          bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue)
                else {
                    return nil
            }
    
            // draw the original image into our standardized context,
            // converting it to the format we chose above
            context.draw(cgImage, in: CGRect(origin: .zero, size: size))

            guard let pixelBuffer = context.data else { return nil }

            // on a little-endian host each pixel reads back as 0xAARRGGBB
            let pointer = pixelBuffer.bindMemory(to: UInt32.self, capacity: width * height)
            let pixel = pointer[Int(point.y) * width + Int(point.x)]
    
            let r: CGFloat = CGFloat(red(for: pixel))   / 255
            let g: CGFloat = CGFloat(green(for: pixel)) / 255
            let b: CGFloat = CGFloat(blue(for: pixel))  / 255
            let a: CGFloat = CGFloat(alpha(for: pixel)) / 255
    
            return UIColor(red: r, green: g, blue: b, alpha: a)
        }
    
        private func alpha(for pixelData: UInt32) -> UInt8 {
            return UInt8((pixelData >> 24) & 255)
        }
    
        private func red(for pixelData: UInt32) -> UInt8 {
            return UInt8((pixelData >> 16) & 255)
        }
    
        private func green(for pixelData: UInt32) -> UInt8 {
            return UInt8((pixelData >> 8) & 255)
        }
    
        private func blue(for pixelData: UInt32) -> UInt8 {
            return UInt8((pixelData >> 0) & 255)
        }
    
        private func rgba(red: UInt8, green: UInt8, blue: UInt8, alpha: UInt8) -> UInt32 {
            return (UInt32(alpha) << 24) | (UInt32(red) << 16) | (UInt32(green) << 8) | (UInt32(blue) << 0)
        }
    
    }
    

    Clearly, if you're going to check a bunch of pixels, you'll want to refactor this (decouple the creation of the standardized pixel buffer from the code that checks the color), but hopefully this illustrates the idea.
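
    A minimal sketch of that refactoring, with a hypothetical pixelBuffer() helper that renders once and then lets you do cheap array lookups:

    extension UIImage {
        // hypothetical helper: render once into a standardized BGRA
        // little-endian buffer, one 0xAARRGGBB word per pixel
        func pixelBuffer() -> (pixels: [UInt32], width: Int, height: Int)? {
            guard let cgImage = cgImage else { return nil }
            let width = Int(size.width)
            let height = Int(size.height)
            var pixels = [UInt32](repeating: 0, count: width * height)
            let colorSpace = CGColorSpaceCreateDeviceRGB()
            let drawn: Bool = pixels.withUnsafeMutableBytes { rawBuffer in
                guard let context = CGContext(data: rawBuffer.baseAddress,
                                              width: width,
                                              height: height,
                                              bitsPerComponent: 8,
                                              bytesPerRow: width * 4,
                                              space: colorSpace,
                                              bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue)
                    else { return false }
                context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
                return true
            }
            return drawn ? (pixels, width, height) : nil
        }
    }

    if let (pixels, width, _) = theimage.pixelBuffer() {
        // many cheap lookups against the single rendered buffer
        for point in [CGPoint(x: 0, y: 0), CGPoint(x: 10, y: 10)] {
            let pixel = pixels[Int(point.y) * width + Int(point.x)]
            print(point, (pixel >> 16) & 255)  // red component; 255 for the red image
        }
    }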


    For earlier versions of Swift, see previous revision of this answer.
