I have lots of large (around 5000 x 5000) matrices that I need to invert in Matlab. I actually need the inverse, so I can't use mldivide instead, which is a lot faster for solving Ax = b for just one b.
I actually need the inverse, so I can't use mldivide instead,...
That's not true, because you can still use mldivide to get the inverse. Note that A^{-1} = A^{-1} * I. In MATLAB, this is equivalent to
invA = A\speye(size(A));
On my machine, this takes about 10.5 seconds for a 5000x5000 matrix. Note that MATLAB does have an inv function to compute the inverse of a matrix. Although it takes about the same amount of time, it is less accurate numerically (see the documentation for inv for more details).
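As a rough sketch of that comparison (the 10.5 s figure is machine-dependent, and the random test matrix here is just a stand-in for your actual data):

n = 5000;                          % size from the question
A = randn(n);                      % stand-in dense matrix; your real matrices go here
tic; invA  = A \ speye(n);  toc    % inverse via mldivide
tic; invA2 = inv(A);        toc    % inverse via inv, roughly the same time
norm(invA - invA2, 'fro')          % the two results agree up to rounding error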
First off, their determinant is 1 so they're definitely invertible
Rather than det(A) = 1, it is the condition number of your matrix that dictates how accurate or stable the inverse will be. Note that det(A) = ∏_{i=1..n} λ_i. So just setting λ_1 = M, λ_n = 1/M, and λ_i = 1 for all other i will give you det(A) = 1. However, as M → ∞, cond(A) = M^2 → ∞ and λ_n → 0, meaning your matrix is approaching singularity and there will be large numerical errors in computing the inverse.
My matrices are coming from a problem that means they have some nice properties.
Of course, there are other more efficient algorithms that can be employed if your matrix is sparse or has other favorable properties. But without any additional info on your specific problem, there is nothing more that can be said.
I would prefer a way to speed Matlab up
MATLAB uses Gaussian elimination to compute the inverse of a general matrix (full rank, non-sparse, without any special properties) using mldivide, and this is Θ(n^3), where n is the size of the matrix. So, in your case, n = 5000 and there are about 1.25 × 10^11 floating point operations. So on a reasonable machine with about 10 Gflops of computational power, you're going to require at least 12.5 seconds to compute the inverse, and there is no way out of this unless you exploit the "special properties" (if they're exploitable).
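The back-of-the-envelope arithmetic, where the 10 Gflop/s sustained rate is an assumption about the machine:

n = 5000;
flops_needed = n^3             % ~1.25e11 floating point operations (order of magnitude)
rate = 10e9;                   % assumed sustained rate of ~10 Gflop/s
flops_needed / rate            % ~12.5 seconds as a rough lower bound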
Inverting an arbitrary 5000 x 5000 matrix is not computationally easy no matter what language you are using. I would recommend looking into approximations. If your matrices are low rank, you might want to try a low-rank approximation M ≈ U*S*V' (a truncated SVD).
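For example, a truncated SVD gives both the approximation and an approximate (pseudo-)inverse cheaply once the rank is small; here r = 50 is a hypothetical target rank, and this is only a sketch under the assumption that A really is close to rank r:

r = 50;                          % hypothetical target rank, r << n
[U, S, V] = svds(A, r);          % truncated SVD: A ~ U*S*V'
invA_approx = V * (S \ U');      % approximate inverse (pseudo-inverse of the rank-r part)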
Here are some more ideas from math-overflow:
https://mathoverflow.net/search?q=matrix+inversion+approximation
First suppose the eigenvalues are all 1. Let A be the Jordan canonical form of your matrix. Then you can compute A^{-1} using only matrix multiplication and addition by

A^{-1} = I + (I-A) + (I-A)^2 + ... + (I-A)^k

where k < dim(A).
Why does this work? Because generating functions are awesome. Recall the expansion

(1-x)^{-1} = 1/(1-x) = 1 + x + x^2 + ...

This means that we can invert (1-x) using an infinite sum. You want to invert a matrix A, so you want to take A = I - X. Solving for X gives X = I - A. Therefore by substitution, we have

A^{-1} = (I - (I-A))^{-1} = I + (I-A) + (I-A)^2 + ...

Here I've just used the identity matrix I in place of the number 1.
Now we have the problem of convergence to deal with, but this isn't actually a problem. By the assumption that A is in Jordan form and has all eigenvalues equal to 1, we know that A is upper triangular with all 1s on the diagonal. Therefore I - A is upper triangular with all 0s on the diagonal. Therefore all eigenvalues of I - A are 0, so its characteristic polynomial is x^{dim(A)} and its minimal polynomial is x^{k+1} for some k < dim(A). Since a matrix satisfies its minimal (and characteristic) polynomial, this means that (I-A)^{k+1} = 0. Therefore the above series is finite, with the largest nonzero term being (I-A)^k. So it converges.
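A minimal sketch of this in MATLAB, using a small Jordan block with eigenvalue 1 as the test matrix:

n = 6;
A = eye(n) + diag(ones(n-1, 1), 1);   % Jordan block for eigenvalue 1
N = eye(n) - A;                       % nilpotent part: N^n = 0
invA = eye(n);
P = eye(n);
for k = 1:n-1
    P = P * N;                        % P = (I-A)^k
    invA = invA + P;                  % accumulate the finite series
end
norm(invA * A - eye(n))               % ~0, so invA really is A^{-1}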
Now, for the general case, put your matrix into Jordan form, so that you have a block diagonal matrix, e.g.:
A 0 0
0 B 0
0 0 C
Where each block has a single constant value along the diagonal. If that value is a for block A, then use the above trick to invert (1/a) * A, and then scale the result by 1/a again, since A^{-1} = (1/a) * ((1/a) * A)^{-1}. Since the full matrix is block diagonal, the inverse will be
A^{-1} 0 0
0 B^{-1} 0
0 0 C^{-1}
There is nothing special about having three blocks, so this works no matter how many you have.
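A small sketch of the block-by-block inversion, with two hypothetical Jordan blocks; plain inv is used on each block only to keep the sketch short (each block could instead be inverted with the series trick after scaling, as described above):

A1 = [2 1; 0 2];                     % Jordan block for eigenvalue 2
A2 = [3 1 0; 0 3 1; 0 0 3];          % Jordan block for eigenvalue 3
J  = blkdiag(A1, A2);                % block diagonal matrix
invJ = blkdiag(inv(A1), inv(A2));    % invert each block separately
norm(invJ * J - eye(size(J)))        % ~0: the block-wise inverse is the full inverse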
Note that this trick works whenever you have a matrix in Jordan form. The computation of the inverse in this case will be very fast in Matlab because it only involves matrix multiplication, and you can even use tricks to speed that up since you only need powers of a single matrix. This may not help you, though, if it's really costly to get the matrix into Jordan form.