I am trying to write a program using OpenCV to calculate the distance from a webcam to a one-inch white sphere. I feel like this should be pretty easy, but for whatever reason I'm drawing a blank. Thanks for the help ahead of time.
You can use triangle similarity to calibrate the camera and find the distance.

You know your ball's real diameter D in some units (e.g. cm). Place it at a known distance Z, say 1 meter = 100 cm, in front of the camera and measure its apparent width in pixels; call this width d.

The focal length of the camera f (in pixels, which varies slightly from camera to camera) is then f = d*Z/D.

When you see the ball again with the same camera and its apparent width is d' pixels, then by triangle similarity f/d' = Z'/D, and thus Z' = D*f/d', where Z' is the ball's current distance from the camera.
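A minimal sketch of that math in Python, assuming you already measure the ball's apparent width in pixels somehow (e.g. with `cv2.minEnclosingCircle` on a thresholded contour); the numbers below are hypothetical:

```python
def focal_length_px(d_px: float, z_known: float, d_real: float) -> float:
    """One-time calibration: f = d * Z / D from a reference measurement."""
    return d_px * z_known / d_real

def distance(f_px: float, d_real: float, d_px: float) -> float:
    """Z' = D * f / d' for a newly observed apparent width d'."""
    return d_real * f_px / d_px

# Hypothetical calibration: a 2.54 cm (1 inch) ball appears 40 px wide
# at a known distance of 100 cm.
f = focal_length_px(40.0, 100.0, 2.54)

# Later, the same ball appears 80 px wide (twice as wide -> half as far).
print(distance(f, 2.54, 80.0))  # 50.0
```

The apparent-width measurement is the only OpenCV-specific part; the distance calculation itself is two multiplications and a division.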
To my mind you will need a camera model, i.e. a calibration model, if you want to measure distance or other real-world quantities. The pinhole camera model is simple, linear, and gives good results (though it won't correct lens distortion, whether radial or tangential).

If you don't use that, you'll only be able to compute a disparity/depth map (for instance with stereo vision), but that is relative and doesn't give you an absolute measurement, only which object is in front of another.

Therefore, I think the answer is: you will need to calibrate somehow. For example, you could ask the user to move the sphere toward the camera until it perfectly fills the image plane; with the ball's size known in advance, you'll then be able to compute the distance.
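The fill-the-frame idea above can be sketched with a pinhole model: when the ball exactly spans the frame width, it subtends the camera's full horizontal field of view, so Z = (D/2) / tan(FOV/2). This assumes you know the horizontal FOV (the 60° below is a made-up example value, not a measured one):

```python
import math

def distance_when_filling_frame(ball_diameter: float, hfov_deg: float) -> float:
    """Distance at which a ball of the given diameter exactly fills
    the frame width, under an ideal pinhole model."""
    return (ball_diameter / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)

# Hypothetical: a 2.54 cm (1 inch) ball, webcam with a 60 degree horizontal FOV.
print(round(distance_when_filling_frame(2.54, 60.0), 2))  # 2.2 (cm)
```

In practice a webcam can't focus that close to a one-inch ball, so you'd calibrate with the ball filling some known fraction of the frame instead, but the geometry is the same.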
Julien,
Source: https://stackoverflow.com/questions/6714069/finding-distance-from-camera-to-object-of-known-size