Is there a robust metric of image sharpness or blurriness? I have various sets of images captured from different optical systems and with different saturation settings.
Autofocus is an interesting problem on its own, and evaluating sharpness across arbitrary images adds another level of complexity.
On sharpness evaluation, I suggest this paper from Cornell. Their conclusion was that a simple intensity-variance metric gave the best evaluation of an image's focus. And it doesn't hurt that it's really easy to calculate!
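A minimal sketch of that variance metric, assuming a grayscale input and using NumPy/OpenCV (neither is prescribed by the paper):

```python
import cv2
import numpy as np

def variance_sharpness(image_path: str) -> float:
    """Global intensity variance as a simple focus/sharpness score."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(image_path)
    return float(np.var(gray.astype(np.float64)))
```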
To create a consistent metric across different images, you'll need a way to normalize; the metric might be in units of variance per pixel. You could also take advantage of the fact that lack of focus puts an upper bound on variance, and look for clustering at the maximal values of local variance.
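Here is a sketch of the normalization idea: a per-pixel local variance map plus a mean-normalized global variance. The window size and the mean normalization are illustrative assumptions, not something the paper prescribes.

```python
import cv2
import numpy as np

def local_variance_map(gray: np.ndarray, ksize: int = 7) -> np.ndarray:
    """Per-pixel variance over a ksize x ksize window (Var = E[x^2] - E[x]^2)."""
    g = gray.astype(np.float64)
    mean = cv2.blur(g, (ksize, ksize))
    mean_sq = cv2.blur(g * g, (ksize, ksize))
    return np.clip(mean_sq - mean * mean, 0.0, None)

def normalized_variance(gray: np.ndarray) -> float:
    """Global variance divided by mean intensity, to reduce exposure dependence."""
    g = gray.astype(np.float64)
    mu = g.mean()
    return float(g.var() / mu) if mu > 0 else 0.0
```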
You can estimate the acutance of the image by taking the mean of a gradient filter's output, i.e. the mean gradient magnitude.
Reference this StackOverflow answer to a similar question.
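A hedged sketch of that idea, interpreting "mean of the gradient filter" as the mean Sobel gradient magnitude (the specific filter and kernel size are my assumptions):

```python
import cv2
import numpy as np

def mean_gradient_magnitude(gray: np.ndarray) -> float:
    """Mean Sobel gradient magnitude as a rough acutance estimate."""
    g = gray.astype(np.float64)
    gx = cv2.Sobel(g, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(g, cv2.CV_64F, 0, 1, ksize=3)
    return float(np.mean(np.hypot(gx, gy)))
```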
You need a no-reference sharpness metric. For example:
Here's a short paper describing a method for detecting blur using a Haar wavelet transform; a simplified sketch follows below.
The other answers to this question may also be helpful.
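This is not the full algorithm from the paper; it is a simplified sketch that takes a one-level 2D Haar wavelet transform and scores sharpness by the energy in the detail (high-frequency) sub-bands, which drops as blur increases.

```python
import numpy as np

def haar_detail_energy(gray: np.ndarray) -> float:
    """Mean energy of the one-level Haar detail sub-bands as a rough sharpness score."""
    g = gray.astype(np.float64)
    # Crop to even dimensions so non-overlapping 2x2 blocks tile the image.
    h, w = g.shape[0] & ~1, g.shape[1] & ~1
    g = g[:h, :w]
    a = g[0::2, 0::2]  # top-left of each 2x2 block
    b = g[0::2, 1::2]  # top-right
    c = g[1::2, 0::2]  # bottom-left
    d = g[1::2, 1::2]  # bottom-right
    lh = (a + b - c - d) / 2.0  # row-difference detail (horizontal edges)
    hl = (a - b + c - d) / 2.0  # column-difference detail (vertical edges)
    hh = (a - b - c + d) / 2.0  # diagonal detail
    detail = np.sqrt(lh**2 + hl**2 + hh**2)
    return float(detail.mean())
```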