I want to build an app that lets the user select an image and it outputs the "average color".
For example, this image:
The average color would be a gr…
This is not an actual "answer" but I feel like I can give some tips about color detection, for what it's worth, so let's go.
The biggest trick for speed in your case is to resize the image to a square of reasonable dimensions.
There's no magic value because it depends on how noisy the image is, among other things, but something under 300x300 seems acceptable for your sampling method, for example (don't go too extreme, though).
Use a fast resize method - no need to keep the aspect ratio, antialias, or anything fancy (there are many implementations available on SO). We're counting colors; we're not interested in how the image looks.
The speed gain we get from resizing is well worth the few cycles lost on resizing.
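As a minimal sketch of that downscale step, here's a nearest-neighbor resize over a flat pixel buffer (the `resizeNearest` name, the packed `0xRRGGBB` Int representation, and the buffer layout are my assumptions for illustration; with a real `UIImage` you'd draw into a small `CGContext` instead):

```swift
// Nearest-neighbor downscale sketch: no antialiasing, ratio not kept.
// Assumption: pixels are packed 0xRRGGBB Ints in a flat row-major array.
func resizeNearest(_ pixels: [Int], width: Int, height: Int,
                   to side: Int) -> [Int] {
    var out = [Int]()
    out.reserveCapacity(side * side)
    for y in 0..<side {
        let srcY = y * height / side        // map output row to source row
        for x in 0..<side {
            let srcX = x * width / side     // map output column to source column
            out.append(pixels[srcY * width + srcX])
        }
    }
    return out
}
```

Nearest-neighbor is the cheapest option, and since we only tally colors afterwards, the blockiness it introduces doesn't matter.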
Second trick is to sample by stepping.
With most photos you can afford to sample every other pixel or every other line and keep the same accuracy for color detection.
You can also skip sampling (or discard once sampled) a few pixels around the borders of most photos - because of borders, frames, vignettes, etc. It helps when making averages (you want to discard anything too marginal that could bias the results unnecessarily).
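Both sampling tips above can be combined into one loop - step over every other pixel and line, and keep a margin away from the edges (the `samplePixels` name and the default `step`/`margin` values are just illustrative assumptions):

```swift
// Sampling sketch: step 2 in both directions, skip a small border margin.
// Assumption: pixels are packed Ints in a flat row-major array.
func samplePixels(_ pixels: [Int], width: Int, height: Int,
                  step: Int = 2, margin: Int = 4) -> [Int] {
    var samples = [Int]()
    var y = margin
    while y < height - margin {             // skip top/bottom margins
        var x = margin
        while x < width - margin {          // skip left/right margins
            samples.append(pixels[y * width + x])
            x += step                       // every other column
        }
        y += step                           // every other row
    }
    return samples
}
```

On a 300x300 resized image this cuts the work to roughly a quarter, with essentially no loss for color detection.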
To be really precise in the sampling, you have to discard the noise: if you keep all the greys, every detection will come out too grey. Filter out the greys by dropping colors with very low saturation, for example.
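A simple way to do that saturation test on a packed RGB value (the `isColorful` helper and the 0.15 threshold are assumptions; tune the threshold to your images, and on iOS you could equally use `UIColor`'s HSB accessors):

```swift
// Grey filter sketch: drop samples whose HSV saturation is below a threshold.
// Saturation = (max - min) / max over the RGB channels (0 when max is 0).
func isColorful(_ rgb: Int, minSaturation: Double = 0.15) -> Bool {
    let r = Double((rgb >> 16) & 0xFF)
    let g = Double((rgb >> 8) & 0xFF)
    let b = Double(rgb & 0xFF)
    let hi = max(r, g, b), lo = min(r, g, b)
    guard hi > 0 else { return false }      // pure black: no saturation
    return (hi - lo) / hi >= minSaturation
}
```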
Then you can count your colors, working on unique colors. Use NSCountedSet, for example, to store the colors and their occurrences; then you can look at the occurrence count of each color, find the most frequent ones, etc.
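The NSCountedSet part could look like this (the sample values and the `ranked` variable are illustrative; I'm still assuming packed-Int colors):

```swift
import Foundation

// Counting sketch: tally unique colors with NSCountedSet, then rank by count.
let samples = [0xFF0000, 0xFF0000, 0x00FF00, 0xFF0000, 0x00FF00, 0x0000FF]
let counted = NSCountedSet()
for color in samples { counted.add(color) } // duplicates bump the count

// Sort unique colors by how often they occur, most frequent first.
let ranked = counted.allObjects
    .compactMap { $0 as? Int }
    .sorted { counted.count(for: $0) > counted.count(for: $1) }
// ranked.first is the dominant color (0xFF0000 here, seen 3 times)
```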
Last tip: filter out lonely colors before calculating the averages - you decide the threshold (like "if it appears fewer than N times in a 300x300 image, it's not worth using"). It helps accuracy a lot.
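Putting the last two tips together, a final averaging pass might look like this sketch (the `averageColor` name, the occurrence-weighted mean, and the `minCount` cutoff are my assumptions about how to combine them):

```swift
import Foundation

// Final average sketch: drop colors seen fewer than `minCount` times,
// then average the survivors' channels, weighted by occurrence count.
func averageColor(of counted: NSCountedSet, minCount: Int) -> Int? {
    var rSum = 0.0, gSum = 0.0, bSum = 0.0, total = 0.0
    for case let color as Int in counted {
        let n = counted.count(for: color)
        guard n >= minCount else { continue }   // lonely color, skip it
        let w = Double(n)
        rSum += Double((color >> 16) & 0xFF) * w
        gSum += Double((color >> 8) & 0xFF) * w
        bSum += Double(color & 0xFF) * w
        total += w
    }
    guard total > 0 else { return nil }         // everything was filtered out
    return (Int(rSum / total) << 16) | (Int(gSum / total) << 8) | Int(bSum / total)
}
```

Returning `nil` when every color falls under the threshold lets the caller fall back to a plain unweighted average if needed.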