Capturing high multi-collinearity in statsmodels

清酒与你 2020-12-04 13:29

Say I fit a model in statsmodels

mod = smf.ols('dependent ~ first_category + second_category + other', data=df).fit()

When I do mod.summary() I may see a warning that the condition number is large, which might indicate strong multicollinearity or other numerical problems. How can I capture this high multi-collinearity condition in a single number or variable?

2 Answers
  • 2020-12-04 14:03

    Based on a similar question for R, there are some other options that may help people. I was looking for a single number that captured the collinearity, and options include the determinant and condition number of the correlation matrix.

    According to one of the R answers, the determinant of the correlation matrix will "range from 0 (Perfect Collinearity) to 1 (No Collinearity)". I found the bounded range helpful.

    Translated example for determinant:

    import numpy as np
    import pandas as pd
    
    # Create a sample random dataframe
    np.random.seed(321)
    x1 = np.random.rand(100)
    x2 = np.random.rand(100)
    x3 = np.random.rand(100)
    df = pd.DataFrame({'x1': x1, 'x2': x2, 'x3': x3})
    
    # Now create a dataframe with multicollinearity
    multicollinear_df = df.copy()
    multicollinear_df['x3'] = multicollinear_df['x1'] + multicollinear_df['x2']
    
    # Compute both correlation matrices
    corr = np.corrcoef(df, rowvar=0)
    multicollinear_corr = np.corrcoef(multicollinear_df, rowvar=0)
    
    # Compare the determinants
    print(np.linalg.det(corr))                 # 0.988532159861
    print(np.linalg.det(multicollinear_corr))  # 2.97779797328e-16
    

    And similarly, the condition number of the correlation matrix will approach infinity as the variables approach perfect linear dependence.

    print(np.linalg.cond(corr))                 # 1.23116253259
    print(np.linalg.cond(multicollinear_corr))  # 6.19985218873e+15
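
    To tie this back to the statsmodels model in the question, the same single-number check can be run on the fitted model's design matrix. The snippet below is only a sketch: it assumes a fitted OLS results object named mod as in the question, uses np.linalg.cond on the design matrix, and adds statsmodels' variance_inflation_factor for a per-regressor view; the VIF cutoff of 10 is just a common rule of thumb, not anything statsmodels enforces.

    import numpy as np
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    # Assumes `mod` is the fitted OLS results object from the question
    X = mod.model.exog                       # design matrix, including the intercept column
    print(np.linalg.cond(X))                 # large values indicate possible multicollinearity

    # Per-variable view: a VIF above ~10 is a common informal flag
    for i, name in enumerate(mod.model.exog_names):
        print(name, variance_inflation_factor(X, i))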
    
  • You can detect high multi-collinearity by inspecting the eigenvalues of the correlation matrix. A very low eigenvalue shows that the data are collinear, and the corresponding eigenvector shows which variables are involved.

    If there is no collinearity in the data, you would expect none of the eigenvalues to be close to zero:

    >>> xs = np.random.randn(100, 5)      # independent variables
    >>> corr = np.corrcoef(xs, rowvar=0)  # correlation matrix
    >>> w, v = np.linalg.eig(corr)        # eigen values & eigen vectors
    >>> w
    array([ 1.256 ,  1.1937,  0.7273,  0.9516,  0.8714])
    

    However, if, say, x[4] - 2 * x[0] - 3 * x[2] ≈ 0 (exact up to a little noise in the example below), then

    >>> noise = np.random.randn(100)                      # white noise
    >>> xs[:,4] = 2 * xs[:,0] + 3 * xs[:,2] + .5 * noise  # collinearity
    >>> corr = np.corrcoef(xs, rowvar=0)
    >>> w, v = np.linalg.eig(corr)
    >>> w
    array([ 0.0083,  1.9569,  1.1687,  0.8681,  0.9981])
    

    one of the eigenvalues (here the very first one) is close to zero. The corresponding eigenvector is:

    >>> v[:,0]
    array([-0.4077,  0.0059, -0.5886,  0.0018,  0.6981])
    

    Ignoring the almost-zero coefficients, the above basically says that x[0], x[2] and x[4] are collinear (as expected). If one standardizes the xs values and multiplies by this eigenvector, the result will hover around zero with small variance:

    >>> std_xs = (xs - xs.mean(axis=0)) / xs.std(axis=0)  # standardized values
    >>> ys = std_xs.dot(v[:,0])
    >>> ys.mean(), ys.var()
    (0, 0.0083)
    

    Note that ys.var() is basically the eigenvalue that was close to zero: the variance of standardized data projected onto a unit-norm eigenvector of the correlation matrix equals the corresponding eigenvalue.

    So, in order to capture high multi-collinearity, look at the eigenvalues of the correlation matrix.
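
    As a rough convenience (not part of the original answer), this eigenvalue check can be wrapped in a small helper that returns each near-zero eigenvalue together with the columns that load on it; the tol and loading_cutoff values below are arbitrary illustrative choices.

    import numpy as np

    def find_collinear_sets(X, tol=0.01, loading_cutoff=0.1):
        """For each near-zero eigenvalue of the correlation matrix of X,
        return (eigenvalue, indices of the participating columns)."""
        corr = np.corrcoef(X, rowvar=0)
        w, v = np.linalg.eig(corr)
        hits = []
        for i in np.where(w < tol)[0]:
            # columns whose loadings on this eigenvector are non-negligible
            cols = np.where(np.abs(v[:, i]) > loading_cutoff)[0]
            hits.append((w[i], cols))
        return hits

    # With the collinear xs from above this should report something like
    # [(0.0083, array([0, 2, 4]))]
    print(find_collinear_sets(xs))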
