I have a couple of USB webcams (fixed focal length) set up as a simple stereoscopic rangefinder, spaced `N` mm apart with each rotated by `M` degrees towards the centerline, and the problem is that you cannot assume pixel-perfect alignment of the cameras.
So let's assume the `x`-axis is the parallax-shifted axis and the `y`-axis is the aligned one. You need to measure the image shift/distortion along the x-axis to detect parallax alignment, even when the cameras are mechanically aligned as well as possible. The absolute difference of individual pixels is not guaranteed to hit its minimum at the aligned position, so instead of subtracting individual pixels, subtract the average color of the area around each pixel, with a radius/size bigger than the alignment error in the y-axis. Let's call this radius or size `r`. This way the resulting difference should be minimal when aligned.
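As a rough sketch of this area-averaged difference (pure Python on grayscale 2D lists; the helper names and the box-shaped neighborhood are my assumptions, not a fixed implementation):

```python
def area_avg(img, x, y, r):
    """Average intensity in the (2r+1)x(2r+1) box around (x, y), clipped to the image."""
    h, w = len(img), len(img[0])
    total, count = 0, 0
    for yy in range(max(0, y - r), min(h, y + r + 1)):
        for xx in range(max(0, x - r), min(w, x + r + 1)):
            total += img[yy][xx]
            count += 1
    return total / count

def shift_score(img_a, img_b, shift, r):
    """Sum of |avg_a - avg_b| over pixels, with img_b sampled shifted along x.
    Lower score means better x alignment; the r-sized averaging absorbs
    small y-axis misalignment."""
    h, w = len(img_a), len(img_a[0])
    score = 0.0
    for y in range(h):
        for x in range(w):
            xb = x + shift
            if 0 <= xb < w:
                score += abs(area_avg(img_a, x, y, r) - area_avg(img_b, xb, y, r))
    return score
```

Scanning `shift_score` over a range of candidate shifts and taking the minimum then gives the x misalignment.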
Approximation search
You can even speed up the process by approximation search. Start with a coarse step in `x` (for example `0.25*r`) and find the shift `x0` with the minimal difference; then restrict the search to `<x0 - 2.0*step, x0 + 2.0*step>` with the step halved, and repeat the halving until the step is smaller than a few pixels. This way you can search in `O(log2(n))` instead of `O(n)`.
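A minimal sketch of that coarse-to-fine search, assuming a generic cost function `score(x)` (the function name, interval bounds, and stopping threshold here are illustrative):

```python
def approx_search(score, x_min, x_max, r, min_step=1.0):
    """Coarse-to-fine minimization of score(x) over <x_min, x_max>.
    Starts with step 0.25*r and halves it each pass, so only a few
    scans (~O(log2(n))) are needed instead of a full O(n) sweep."""
    step = 0.25 * r
    x0 = x_min
    while True:
        # scan the current interval with the current step, keep the best x
        best, x = None, x_min
        while x <= x_max:
            s = score(x)
            if best is None or s < best:
                best, x0 = s, x
            x += step
        if step <= min_step:      # stop once the step is a few pixels or less
            return x0
        # narrow the interval around the best x and halve the step
        x_min, x_max = x0 - 2.0 * step, x0 + 2.0 * step
        step *= 0.5
```

For the rangefinder, `score` would be the area-averaged image difference as a function of the x shift.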
Computer vision approach

This should be even faster: measure the shift between the two images directly with a CV technique, for example cross-correlation of matching scanlines or matching detected features. This way you can avoid checking the whole x-range because the alignment distance is obtained directly... You just need to convert it to an angle, or whatever you use to align the parallax.
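A minimal sketch of such a direct measurement, using a brute-force overlap-normalized correlation of one scanline per camera (a real implementation would rather use FFT-based correlation or a feature detector; all names here are illustrative):

```python
def best_shift(line_a, line_b, max_lag):
    """Return the integer lag that maximizes the correlation between the
    two scanlines; that lag is the x misalignment, obtained directly."""
    best_lag, best_corr = 0, None
    for lag in range(-max_lag, max_lag + 1):
        num = n = 0.0
        for i in range(len(line_a)):
            j = i + lag
            if 0 <= j < len(line_b):
                num += line_a[i] * line_b[j]
                n += 1
        if n:
            corr = num / n        # normalize by overlap length
            if best_corr is None or corr > best_corr:
                best_corr, best_lag = corr, lag
    return best_lag
```

The returned lag (in pixels) can then be converted to the alignment angle via the camera's focal length and pixel pitch.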
[notes]
You do not need to do this on the whole image area; just select a few horizontal lines across the images and scan their nearby area.
There are also other ways to detect alignment. For example, at short distances the perspective skew is a significant marker of alignment, so compare the height of an object on its left and right side between the cameras... If they are nearly the same you are aligned; if bigger/smaller you are not, and the sign tells you which way to turn...
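A sketch of that skew comparison, assuming you can already measure the object's height at its left and right edge in each camera image (the helper name and its inputs are hypothetical):

```python
def skew_marker(h_left_a, h_right_a, h_left_b, h_right_b):
    """Compare perspective skew of the same object as seen by cameras A and B.
    Near-zero result means aligned; the sign tells which way to turn."""
    skew_a = h_left_a / h_right_a   # height ratio at the object's edges, camera A
    skew_b = h_left_b / h_right_b   # height ratio at the object's edges, camera B
    return skew_a - skew_b
```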