I have a 2D bitmap-like array of let's say 500*500 values. I'm trying to create a linear gradient on the array, so the resulting bitmap would look something like this (in grayscale):
There are two parts to this problem.
1. Given two colors A and B and some percentage p, determine what color lies p percent of the way from A to B.
2. Given a point on a plane, find the orthogonal projection of that point onto a given line.
The given line in part 2 is your gradient line. Given any point P, project it onto the gradient line. Let's say its projection is R. Then figure out how far R is from the starting point of your gradient segment, as a percentage of the length of the gradient segment. Use this percentage in your function from part 1 above. That's the color P should be.
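Here is a minimal sketch of that projection step, assuming the gradient segment is given by two hypothetical endpoints g0 and g1 as (x, y) pairs (the question doesn't name a language, so Python is just used for illustration):

```python
def gradient_fraction(p, g0, g1):
    """Return how far the orthogonal projection of point p onto the
    segment g0->g1 lies along that segment, as a fraction in [0, 1]."""
    # Gradient direction vector and the vector from g0 to the pixel.
    dx, dy = g1[0] - g0[0], g1[1] - g0[1]
    px, py = p[0] - g0[0], p[1] - g0[1]
    # Dot product over squared segment length gives the projection's
    # position along the segment: 0 at g0, 1 at g1.
    t = (px * dx + py * dy) / float(dx * dx + dy * dy)
    # Clamp so pixels that project before g0 or past g1 still get the
    # end colors rather than values outside [0, 1].
    return max(0.0, min(1.0, t))
```

The clamped fraction is exactly the percentage you feed into the color-blending function from part 1.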
Note that, contrary to what other people have said, you can't just view your colors as regular numbers in your function from part 1. That will almost certainly not do what you want. What you do depends on the color space you are using. If you want an RGB gradient, then you have to look at the red, green, and blue color components separately.
For example, if you want a color "halfway between" pure red and blue, then in hex notation you are dealing with
ff 00 00
and
00 00 ff
Probably the color you want is something like
80 00 80
which is a nice purple color. You have to average out each color component separately. If you try to just average the hex numbers 0xff0000 and 0x0000ff directly, you get 0x7F807F, which is a medium gray. I'm guessing this explains at least part of the problem with your picture above.
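A minimal sketch of part 1 along those lines, assuming colors are (r, g, b) tuples of 8-bit integers (blend is just a hypothetical helper name):

```python
def blend(color_a, color_b, p):
    """Interpolate each RGB channel independently, p fraction of the
    way from color_a to color_b."""
    return tuple(round(a + (b - a) * p) for a, b in zip(color_a, color_b))

# blend((0xff, 0x00, 0x00), (0x00, 0x00, 0xff), 0.5) -> (128, 0, 128),
# i.e. 80 00 80, the purple described above, not a gray.
```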
Alternatively, if you are in the HSV color space, you may want to adjust only the hue component and leave the others as they are.
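As a rough sketch of that variant, using Python's standard colorsys module (which works on floats in [0, 1]); blend_hue is a hypothetical helper, and a fuller version would also take the shorter way around the hue circle:

```python
import colorsys

def blend_hue(rgb_a, rgb_b, p):
    """Interpolate only the hue; keep saturation and value from rgb_a.
    Inputs and output are (r, g, b) floats in [0, 1]."""
    h1, s1, v1 = colorsys.rgb_to_hsv(*rgb_a)
    h2, _, _ = colorsys.rgb_to_hsv(*rgb_b)
    return colorsys.hsv_to_rgb(h1 + (h2 - h1) * p, s1, v1)
```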