How to detect colors under different illumination conditions

Asked by 你的背包 on 2021-01-01 00:41

I have a bunch of images of clothes of many colors and I want to detect the colors of each image. Say that I have a blue skirt image in daylight conditions and I can get the

1 Answer
  • 2021-01-01 01:18

    That's a very hard problem, and people are still trying to solve it today. The gist of it is to find a colour quantization of an image into a representative set of basic colours, in a way that is robust to different external conditions: lighting, shade, poor illumination, and so on.

    Unfortunately I can't suggest any one algorithm that will work for all cases. However, one algorithm that worked for me in the past, when I was doing work in image retrieval, is the one by Jiebo Luo and David Crandall from Kodak Research Labs: http://vision.soic.indiana.edu/papers/compoundcolor2004cvpr.pdf

    The basic idea is to use the ISCC-NBS colour palette. This link is also quite helpful: https://www.w3schools.com/colors/colors_nbs.asp. It is a set of 267 colours that are representative of the colours we encounter in everyday life. Usually when we describe a colour, we use one or more adjectives followed by the dominant hue. For example, that shirt is a darkish pale blue, or a light bright yellow, and so on. The beauty of this approach is that when the colour in question is subject to different external conditions, the adjectives absorb that variation, but at the end of the day the last part of the colour name, the dominant hue, is what we're after.

    Each of these colours has an associated RGB value. These colours are transformed into the CIE Lab colour space, which gives a 267-entry CIE Lab lookup table.
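
    As a minimal sketch of how such a lookup table could be built in Python (assuming scikit-image is installed; the colour names and RGB triplets below are illustrative placeholders, not the actual 267 ISCC-NBS entries):

```python
import numpy as np
from skimage.color import rgb2lab

# Placeholder subset of ISCC-NBS-style entries: (name, RGB triplet in 0-255).
# The real table has 267 entries; these values are for illustration only.
ISCC_NBS = [
    ("vivid red",          (190,   1,  50)),
    ("strong blue",        ( 43,  76, 126)),
    ("light yellow",       (248, 222, 126)),
    ("dark grayish brown", ( 62,  50,  44)),
]

names = [name for name, _ in ISCC_NBS]
rgb = np.array([c for _, c in ISCC_NBS], dtype=float) / 255.0

# rgb2lab expects an image-shaped array, so treat the list as an N x 1 "image".
lut1 = rgb2lab(rgb.reshape(-1, 1, 3)).reshape(-1, 3)   # LUT1: N x 3 CIE Lab rows
```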

    To classify a particular input colour, you transform its RGB values into the CIE Lab colour space and then determine the closest colour in this lookup table. It has been shown that the Euclidean distance between two colours in the CIE Lab colour space is a good approximation of the perceived difference between them. Once we determine which entry of the lookup table the colour is closest to, we strip out all of the adjectives, keep the dominant hue, and classify the colour accordingly.

    For example, if we had an RGB pixel, converted it to Lab, and found that the closest colour was bright yellow, we would remove the "bright", and the final colour representing that RGB pixel would be yellow.
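
    A rough sketch of that per-pixel classification step, reusing the placeholder `lut1` and `names` built above (the adjective stripping here is a simple heuristic that assumes the dominant hue is the last word of the colour name):

```python
import numpy as np
from skimage.color import rgb2lab

def classify_pixel(rgb_pixel, lut1, names):
    """Classify one RGB pixel (0-255) against a CIE Lab lookup table."""
    pixel = np.asarray(rgb_pixel, dtype=float).reshape(1, 1, 3) / 255.0
    lab = rgb2lab(pixel).reshape(3)

    # Euclidean distance in Lab approximates perceptual colour difference.
    nearest = int(np.linalg.norm(lut1 - lab, axis=1).argmin())

    full_name = names[nearest]
    dominant_hue = full_name.split()[-1]   # strip adjectives, keep the hue word
    return full_name, dominant_hue

# classify_pixel((250, 230, 120), lut1, names)  ->  ('light yellow', 'yellow')
```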


    Therefore, the final algorithm is this:

    1. Find the ISCC-NBS colour set's RGB values, convert them to CIE Lab, and create a lookup table, which I call LUT1. In Python, for example, this could simply be a 2D list or a 2D NumPy array.
    2. Create another lookup table, which I call LUT2, that stores the dominant hue for each colour in the ISCC-NBS set: strip out all of the adjectives and keep only the dominant hue. In Python, for example, you could create a dictionary where the key is the corresponding row index of LUT1 and the value is the basic colour itself. Whether that value is a string or an RGB triplet representing the basic colour is up to you.
    3. For the pixel in question, find the closest ISCC-NBS colour in LUT1 by computing the Euclidean distance between the pixel's CIE Lab components and each entry of LUT1.
    4. Once we find this location in LUT1, use the same index to index into LUT2 and get the final colour that classifies the input pixel. An end-to-end sketch is given just below this list.
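
    Putting the four steps together, here is one possible end-to-end sketch, under the same assumptions as above (scikit-image installed, placeholder ISCC-NBS subset). The final majority vote over all pixels is just one simple way to go from per-pixel labels to a single colour name for a clothing image, not part of the algorithm itself:

```python
import numpy as np
from collections import Counter
from skimage.color import rgb2lab

# Step 1: placeholder ISCC-NBS-style entries (name, RGB 0-255); illustrative only.
ISCC_NBS = [
    ("vivid red",          (190,   1,  50)),
    ("strong blue",        ( 43,  76, 126)),
    ("light yellow",       (248, 222, 126)),
    ("dark grayish brown", ( 62,  50,  44)),
]
names = [name for name, _ in ISCC_NBS]
rgb = np.array([c for _, c in ISCC_NBS], dtype=float) / 255.0
lut1 = rgb2lab(rgb.reshape(-1, 1, 3)).reshape(-1, 3)        # LUT1: N x 3 Lab rows

# Step 2: LUT2 maps each LUT1 row index to its dominant hue (last-word heuristic).
lut2 = {i: name.split()[-1] for i, name in enumerate(names)}

def classify_image(image_rgb):
    """Return the most common dominant hue over all pixels of an H x W x 3 uint8 image."""
    lab = rgb2lab(image_rgb.astype(float) / 255.0).reshape(-1, 3)

    # Steps 3-4: nearest LUT1 entry per pixel (Euclidean distance in Lab),
    # then look up the dominant hue in LUT2.
    distances = np.linalg.norm(lab[:, None, :] - lut1[None, :, :], axis=2)
    nearest = distances.argmin(axis=1)
    return Counter(lut2[i] for i in nearest).most_common(1)[0][0]

# Usage: a small synthetic bluish patch standing in for a skirt image.
patch = np.full((4, 4, 3), (45, 80, 130), dtype=np.uint8)
print(classify_image(patch))   # 'blue' with the placeholder table above
```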

    Hope this helps!
