I am looking at making an app that uses a camera to measure the amount of light present when an image is taken. Some conditional behavior would take place based on how much light is detected.
First random thought: apply something like a threshold filter to remove "objects", take what's left of the source image, threshold it again, and count white vs. black pixels. That should give you a VERY rough idea of light vs. dark.
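A minimal sketch of that idea, assuming OpenCV (`cv2`) is available; the threshold value of 128 is just an arbitrary midpoint, not something from the answer above:

```python
import cv2

# Load the capture as a single-channel grayscale image.
img = cv2.imread("capture.jpg", cv2.IMREAD_GRAYSCALE)

# Threshold at the midpoint: pixels >= 128 become white (255), the rest black (0).
_, binary = cv2.threshold(img, 128, 255, cv2.THRESH_BINARY)

white = int((binary == 255).sum())
black = binary.size - white

# Crude light/dark score: the fraction of pixels that ended up white.
print(f"light fraction: {white / binary.size:.2f}")
```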
A very rough estimate could be made by converting each colour channel value of each pixel to its intensity, using the known or assumed gamma of the camera. Then just sum the intensities across the whole image.
If you want the level to approximate what a human observer would perceive, weight the green channel more heavily and the blue channel less (our eyes are particularly sensitive to green and relatively insensitive to blue).
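A sketch of both steps, assuming a camera gamma of 2.2 (substitute the real value if you know it) and the Rec. 709 luminance weights for linear RGB; both constants are assumptions, not part of the answer above:

```python
import numpy as np
from PIL import Image

GAMMA = 2.2  # assumed camera gamma

# Load as an RGB array of floats in [0, 1].
rgb = np.asarray(Image.open("capture.jpg").convert("RGB"), dtype=np.float64) / 255.0

# Undo the gamma encoding to get (approximately) linear intensities.
linear = rgb ** GAMMA

# Rec. 709 luminance weights: green dominates, blue contributes least.
weights = np.array([0.2126, 0.7152, 0.0722])
luminance = linear @ weights

# Total and mean perceived light level across the whole image.
print(f"summed intensity: {luminance.sum():.1f}")
print(f"mean intensity:   {luminance.mean():.3f}")
```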
If it needs to be very accurate, consider regression techniques such as least squares (in particular, partial least squares). With these methods you build a set of training data (reference images taken at several known times), and the model interpolates between them to approximate the time of day for a new image.
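A hedged sketch using scikit-learn's `PLSRegression`; the choice of per-channel mean brightness as features, the two-component model, and the training values are all illustrative assumptions:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hypothetical training data: one row of features per reference image
# (mean R, G, B brightness in [0, 1]), paired with the known time of day
# expressed as hours since midnight.
X_train = np.array([
    [0.10, 0.12, 0.20],  # dawn
    [0.55, 0.60, 0.58],  # midday
    [0.30, 0.25, 0.15],  # dusk
    [0.02, 0.02, 0.05],  # night
])
y_train = np.array([6.0, 12.0, 19.0, 23.0])

# Fit a partial least squares model on the references.
pls = PLSRegression(n_components=2)
pls.fit(X_train, y_train)

# Predict the time of day for a new image's features.
X_new = np.array([[0.40, 0.45, 0.40]])
print(pls.predict(X_new))  # approximate hour of day
```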
If it doesn't need to be that accurate, you can get a good approximation by simply calculating the average pixel value's distance from pure white or pure black. If the camera has auto-exposure, you can factor that in too.
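A minimal sketch of that approximation, assuming a recent Pillow; reading `ExposureTime` from EXIF to compensate for auto-exposure is an assumption about where that metadata lives, and it won't be present on every camera:

```python
import numpy as np
from PIL import Image, ExifTags

img = Image.open("capture.jpg")

# Mean grayscale value in [0, 1]: the average distance from black.
gray = np.asarray(img.convert("L"), dtype=np.float64) / 255.0
brightness = gray.mean()
print(f"distance from black: {brightness:.3f}")
print(f"distance from white: {1.0 - brightness:.3f}")

# If the camera auto-exposed, a long exposure time means the scene was
# darker than the pixel values alone suggest; scale accordingly.
exif = img.getexif().get_ifd(ExifTags.IFD.Exif)
exposure = exif.get(ExifTags.Base.ExposureTime)  # seconds, may be None
if exposure:
    print(f"brightness / exposure: {brightness / float(exposure):.3f}")
```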