Question
I'm trying to do something that should be nothing more than a two-dimensional linear interpolation, but so far I haven't found the correct approach. To describe the problem in a slightly simplified form: there is a drawing area with a size of 3000x3000 pixels on which I have to draw, e.g., a horizontal line. To do that I'm drawing dots or short lines from every pixel position to the next pixel position, which then form a line.
Now a correction has to be applied to the whole thing. The correction information is given in a (for this example, simplified) 4 by 4 array, where every element contains a pair of coordinates describing the values after correction. So a neutral array (with no correction) would look like this:
0,0 1000,0 2000,0 3000,0
0,1000 1000,1000 2000,1000 3000,1000
0,2000 1000,2000 2000,2000 3000,2000
0,3000 1000,3000 2000,3000 3000,3000
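In code (no particular language is required here; Python is used just to illustrate the data structure), this neutral table could be represented as a nested list of coordinate pairs:

```python
# Neutral 4x4 correction table: node [row][col] maps to (col * 1000, row * 1000),
# i.e. every grid node keeps its original, uncorrected position.
neutral = [[(col * 1000, row * 1000) for col in range(4)] for row in range(4)]
```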
A real correction table would contain other coordinates describing the correction to be done:
![](https://i0.wp.com/i.stack.imgur.com/5DRhg.png)
So as input data I have the coordinates of the points on the line without correction, the grid values without correction, and the correction data. But how can I calculate the line's points with the correction applied, so that a distorted line is drawn as shown on the right side of the image? My current approach with two separate linear interpolations for X and Y does not work: the Y position jumps at a cell border instead of changing smoothly within a cell.
So...any ideas how this could be done?
Answer 1:
You have to agree on an interpolation method first. I would suggest either bilinear or barycentric interpolation. In one of my previous posts I visualized the difference between the two methods.
I'll concentrate on bilinear interpolation. We want to transform any point within a cell to its corrected point, so each point can be transformed separately.
We need the interpolation parameters u and v for the point (x, y). Because we have an axis-aligned grid, this is pretty simple:
u = (x - leftCellEdge) / (rightCellEdge - leftCellEdge)
v = (y - bottomCellEdge) / (topCellEdge - bottomCellEdge)
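As a minimal sketch (Python assumed; the names for the cell edges are just placeholders), the two parameters could be computed like this:

```python
def cell_parameters(x, y, left_edge, right_edge, bottom_edge, top_edge):
    """Normalized position of (x, y) inside an axis-aligned cell, each in [0, 1]."""
    u = (x - left_edge) / (right_edge - left_edge)
    v = (y - bottom_edge) / (top_edge - bottom_edge)
    return u, v
```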
We could reconstruct the point by bilinear interpolation:
p2 p4
x----x
| o |
x----x
p1 p3
o = (1 - u) * ((1 - v) * p1 + v * p2) + u * ((1 - v) * p3 + v * p4)
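As code, the same blend might look like this (a Python sketch; p1 through p4 are the (x, y) corner points in the layout sketched above, and `bilerp` is just an illustrative name):

```python
def bilerp(u, v, p1, p2, p3, p4):
    """Bilinear blend of four corner points; u runs left -> right, v runs p1 -> p2."""
    return tuple(
        (1 - u) * ((1 - v) * a + v * b) + u * ((1 - v) * c + v * d)
        for a, b, c, d in zip(p1, p2, p3, p4)
    )
```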
Now, the same formula can be used for the corrected points. If you use the original points p1 through p4, you'll get the uncorrected line point. If you use the corrected cell points for p1 through p4, you'll get the corrected line point.
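Putting it together for the 4x4 table from the question, a sketch could look like this (assuming the uncorrected grid nodes are spaced 1000 pixels apart as in the example; the function and variable names are mine):

```python
def correct_point(x, y, correction, cell_size=1000.0):
    """Map an uncorrected point (x, y) to its corrected position by bilinearly
    interpolating the four corrected nodes of the surrounding grid cell."""
    n_cells_x = len(correction[0]) - 1
    n_cells_y = len(correction) - 1

    # Locate the cell containing (x, y); clamp so points on the far edges
    # still fall into the last cell.
    col = min(int(x // cell_size), n_cells_x - 1)
    row = min(int(y // cell_size), n_cells_y - 1)

    # Interpolation parameters inside the cell.
    u = (x - col * cell_size) / cell_size
    v = (y - row * cell_size) / cell_size

    # Corrected corner points in the layout sketched above
    # (p1/p3 on the row edge, p2/p4 on the row + 1 edge).
    p1 = correction[row][col]
    p2 = correction[row + 1][col]
    p3 = correction[row][col + 1]
    p4 = correction[row + 1][col + 1]

    cx = (1 - u) * ((1 - v) * p1[0] + v * p2[0]) + u * ((1 - v) * p3[0] + v * p4[0])
    cy = (1 - u) * ((1 - v) * p1[1] + v * p2[1]) + u * ((1 - v) * p3[1] + v * p4[1])
    return cx, cy


# With the neutral table the correction is the identity:
neutral = [[(c * 1000, r * 1000) for c in range(4)] for r in range(4)]
print(correct_point(1500, 750, neutral))  # (1500.0, 750.0)
```

Applying `correct_point` to every point of the uncorrected line gives the distorted line; since u and v vary continuously across cell borders, the corrected positions change smoothly instead of jumping at the borders.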
Source: https://stackoverflow.com/questions/23077084/linear-interpolation-calculate-correction-based-on-2d-table