Given a set of points in 3D ( X = (x1, x2, x3), Y = (y1, y2, y3) ), how can I fit a transformation from X to Y?
As far as I know, this is called a projective transformation.
So, the task is to find the best-fitting linear transformation, right?
There is a simple solution using linear regression. Say the transformation matrix is named A and has dimensions 3x3, and say you have N vectors (points) in 3D before and after the transformation, so you have matrices X and Y with 3 rows and N columns. Then the transformation is:
Y = A X + b
where b is a vector of length 3 that specifies the shift. You can rewrite the matrix multiplication using indices:
y[i,j] = sum(k=1..3)(a[i,k] * x[k,j]) + b[i]
for i = 1..3 and j = 1..N. So you have 12 unknowns (the entries of A and b) and 3 * N equations. For N >= 4, you can simply find the best solution using linear regression.
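To see where the equations come from: the point x = (0, 1, 1), which maps to y = (1, 1, 2) in the example below, contributes the three equations 1 = a[1,2] + a[1,3] + b[1], 1 = a[2,2] + a[2,3] + b[2] and 2 = a[3,2] + a[3,3] + b[3].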
For example, in R it is very easy:
# input data
X = matrix(c(c(0, 0, 0), c(1, 0, 0), c(0, 1, 0), c(0, 1, 1)), nrow = 3)
Y = matrix(c(c(1, 0, 1), c(2, 0, 1), c(1, 1, 1), c(1, 1, 2)), nrow = 3)
# expected transformation: A is identity matrix, b is [1, 0, 1]
N = dim(Y)[2]
# transform data for regression: stack the 3*N equations into one system;
# a1, a2, a3 carry the x-values for the equations of rows 1, 2, 3 of A
a1 = rbind(t(X), matrix(rep(0, 3*2*N), ncol = 3))
a2 = rbind(matrix(rep(0, 3*N), ncol = 3), t(X), matrix(rep(0, 3*N), ncol = 3))
a3 = rbind(matrix(rep(0, 3*2*N), ncol = 3), t(X))
# b1, b2, b3 are indicator columns for the three components of the shift b
b1 = rep(1:0, c(N, 2*N))
b2 = rep(c(0, 1, 0), each = N)
b3 = rep(0:1, c(2*N, N))
# response: the rows of Y stacked into one vector (y[1,], then y[2,], then y[3,])
y = as.vector(t(Y))
# do the regression; the 0 removes the global intercept,
# since the shift is modelled by b1, b2, b3
summary(lm(y ~ 0 + a1 + a2 + a3 + b1 + b2 + b3))
And the output is:
[...]
Coefficients:
Estimate Std. Error t value Pr(>|t|)
a11 1.000e+00 NA NA NA
a12 -2.220e-16 NA NA NA
a13 -3.612e-32 NA NA NA
a21 7.850e-17 NA NA NA
a22 1.000e+00 NA NA NA
a23 -1.743e-32 NA NA NA
a31 0.000e+00 NA NA NA
a32 0.000e+00 NA NA NA
a33 1.000e+00 NA NA NA
b1 1.000e+00 NA NA NA
b2 -7.850e-17 NA NA NA
b3 1.000e+00 NA NA NA
Residual standard error: NaN on 0 degrees of freedom
Multiple R-squared: 1, Adjusted R-squared: NaN
F-statistic: NaN on 12 and 0 DF, p-value: NA
As expected, the estimates give A equal to the identity matrix (up to rounding error) and b = [1, 0, 1]. The standard errors are NA here because with N = 4 there are exactly as many equations as unknowns (0 residual degrees of freedom); with more points you get a genuine least-squares fit and error estimates.
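If you want A and b as R objects instead of reading them off the summary, here is a minimal sketch reusing y, a1, ..., b3 from above (fit, cf, A and b are just names chosen here; the coefficient order follows the order of the terms in the formula):
fit <- lm(y ~ 0 + a1 + a2 + a3 + b1 + b2 + b3)
cf <- coef(fit)
# the first 9 coefficients are the rows of A (a11..a13, a21..a23, a31..a33),
# the last 3 are the components of the shift b
A <- matrix(cf[1:9], nrow = 3, byrow = TRUE)
b <- cf[10:12]
# sanity check: apply the fitted transformation to X and compare with Y
max(abs(A %*% X + b - Y))  # numerically zero for the example data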