Eigenvectors computed with numpy's eigh and svd do not match

青春惊慌失措 2021-02-04 05:17

Consider the singular value decomposition M = U S V*. Then the eigenvalue decomposition of M* M gives M* M = V S* U* U S V* = V (S* S) V*. I wish to verify this equality with numpy by showing that the eigenvectors returned by eigh applied to M* M are the same as the right singular vectors returned by svd applied to M.
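
A minimal sketch of the comparison I assume is being made (the names M, V1 and V2, and the column reordering, are my own, not from the question):

    import numpy as np

    M = np.random.randn(50, 20)

    # Eigenvectors of M* M; eigh returns eigenvalues in ascending order,
    # so reverse the columns to match svd's descending singular values.
    w, V1 = np.linalg.eigh(M.T @ M)
    V1 = V1[:, ::-1]

    # SVD of M; the rows of vh are the right singular vectors, so transpose.
    u, s, vh = np.linalg.svd(M, full_matrices=False)
    V2 = vh.T

    # A naive element-wise comparison generally fails, even though both results are valid.
    print(np.allclose(V1, V2))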

1 Answer
  • 2021-02-04 05:58

    Just play with small numbers to debug your problem.

    Start with A=np.random.randn(3,2) instead of your much larger matrix of size (50,20).

    In my random case, I find that

    v1 = array([[-0.33872745,  0.94088454],
       [-0.94088454, -0.33872745]])
    

    and for v2:

    v2 = array([[ 0.33872745, -0.94088454],
       [ 0.94088454,  0.33872745]])
    

    They only differ by a sign, and obviously, even when normalized to unit norm, an eigenvector is only determined up to a sign.

    Now if you try the trick

    assert np.all(np.isclose(V1,-1*V2))
    

    for your original big matrix, it fails... and again, this is OK. What happens is that some of the eigenvectors have been multiplied by -1 and others haven't.

    A correct way to check for equality between the vectors is:

    assert np.allclose(np.abs((V1 * V2).sum(0)), 1.)
    

    and indeed, to get a feeling for how this works, you can print this quantity:

    (V1*V2).sum(0)
    

    which is indeed either +1 or -1 depending on the vector:

    array([ 1., -1.,  1.,  1.,  1.,  1.,  1.,  1.,  1.,  1.,  1.,  1.,  1.,
        1., -1.,  1.,  1.,  1., -1., -1.])
    
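    If you want an element-wise check instead, a small sketch (reusing V1 and V2 from above, and assuming all eigenvalues are distinct) is to flip each column of V2 by its relative sign first:

    signs = np.sign((V1 * V2).sum(0))   # per-column relative sign, +1 or -1
    assert np.allclose(V1, V2 * signs)  # element-wise equality once the signs are aligned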

    EDIT: This sign-only difference is what you will see in most cases, especially when starting from a random matrix. Notice, however, that this test will likely fail if one or more eigenvalues have an eigenspace of dimension larger than 1, as pointed out by @Sven Marnach in his comment below:

    There might be other differences than just vectors multiplied by -1. If any of the eigenvalues has a multi-dimensional eigenspace, you might get an arbitrary orthonormal basis of that eigenspace, and two such bases might be rotated against each other by an arbitrary unitary matrix.
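
    If that degenerate case matters, a sketch of a basis-invariant check (my own addition, not from the answer or the comment) is to compare the orthogonal projector onto each eigenspace rather than the individual eigenvectors:

    # Group numerically equal eigenvalues and compare the projectors onto their eigenspaces.
    w, V1 = np.linalg.eigh(M.T @ M)
    _, s, vh = np.linalg.svd(M, full_matrices=False)
    V2 = vh.T[:, ::-1]                      # reorder to match eigh's ascending eigenvalues

    for val in np.unique(np.round(w, 8)):
        cols = np.isclose(w, val)
        P1 = V1[:, cols] @ V1[:, cols].T    # projector from the eigh basis
        P2 = V2[:, cols] @ V2[:, cols].T    # projector from the svd basis
        assert np.allclose(P1, P2)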
