How to generate a pseudo-random positive definite matrix with constraints on the off-diagonal elements?

误落风尘 2021-01-13 12:24

The user wants to impose a unique, non-trivial upper/lower bound on the correlation between every pair of variables in a variance/covariance matrix.

For example: I want a vari

4 answers
  • 2021-01-13 12:35

    Here is your response to my answer in the original thread:

    "Come on people, there must be something simpler"

    I'm sorry, but there is not. Wanting to win the lottery is not enough. Demanding that the Cubs win the series is not enough. Nor can you just demand a solution to a mathematical problem and suddenly find it is easy.

    The problem of generating pseudo-random deviates with sample parameters in a specified range is non-trivial, at least if the deviates are to be truly pseudo-random in any sense. Depending on the range, one may be lucky. I suggested a rejection scheme, but also stated it was not likely to be a good solution. If there are many dimensions and tight ranges on the correlations, then the probability of success is poor. Also important is the sample size, as that will drive the sample variance of the resulting correlations.

    If you truly want a solution, you need to sit down and specify your goal, clearly and exactly. Do you want a random sample with a nominal specified correlation structure, but strict bounds on the correlations? Will any sample correlation matrix that satisfies the bounds be satisfactory? Are the variances also given?
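    The point above about sample size driving the sample variance of the resulting correlations is easy to see numerically. A Python sketch (the nominal correlation, sample sizes, and trial count are illustrative choices, not from the answer):

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.4                                    # nominal correlation
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))

def corr_sd(n, trials=1000):
    """Std. dev. of the sample correlation across repeated draws of size n."""
    rs = [np.corrcoef(rng.standard_normal((n, 2)) @ L.T, rowvar=False)[0, 1]
          for _ in range(trials)]
    return float(np.std(rs))

for n in (50, 500, 5000):
    print(n, round(corr_sd(n), 4))           # spread shrinks roughly like 1/sqrt(n)
```

    With tight bounds on the correlations, only a large sample size concentrates enough of the sampling distribution inside the interval for a rejection scheme to be practical.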

  • 2021-01-13 12:36

    You can create a set of N random vectors of size M and unit variance, and add to each a common random vector (size M, unit variance) multiplied by a certain number k. The correlation matrix of all those vectors will be positive definite (only positive semi-definite when M < N). If M is very large there is essentially no spread in the correlation distribution and every correlation approaches k^2/(1+k^2). The smaller M gets, the wider the distribution of the off-diagonal elements. Alternatively, you can let M be very large and multiply the "common vector" by a different k for each vector. You might get tighter control if you tune these parameters properly. Here is some Matlab code to do that:

    clear all;
    vecLarg = 10;                % M: samples per variable
    theDim  = 1000;              % N: number of variables
    k = 1;                       % strength of the common component
    corrDist = zeros(theDim,1);  % optional per-variable loadings (zero here)
    Baux = randn(vecLarg,theDim) + (corrDist*randn(1,vecLarg))' + (k*ones(theDim,1)*randn(1,vecLarg))';
    A = corrcoef(Baux);
    hist(A(:),100);
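    The same construction can be checked in Python; this is a sketch (k, the dimensions, and the seed are illustrative), confirming that the off-diagonals cluster near k^2/(1+k^2):

```python
import numpy as np

rng = np.random.default_rng(0)

# N variables observed over M samples: each variable is its own noise plus
# k times one shared vector, so every pairwise correlation targets
# k**2 / (1 + k**2).
k, n_vars, n_samples = 1.0, 50, 20000
X = rng.standard_normal((n_samples, n_vars))
common = rng.standard_normal((n_samples, 1))
C = np.corrcoef(X + k * common, rowvar=False)

off = C[~np.eye(n_vars, dtype=bool)]
print(off.mean())    # clusters near k**2/(1+k**2) = 0.5
```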
    
  • 2021-01-13 12:37

    Perhaps this answer will help operationalize it:

    One class of matrices that has this property of non-negative definiteness is the Wishart distribution. Samples from ~W() such that all off-diagonal entries are between some bounds [l,u] would fit your question. However, I don't believe this is the same as the distribution of all positive-definite matrices with off-diagonals in [l,u].

    On the Wikipedia page there is an algorithm for sampling from ~W().

    A simpler, hackish solution (possibly approximating this) is to:

    (given that u>l and l>0)

    1. Draw from a multivariate normal whose Sigma has unit diagonal and off-diagonal entries equal to mean(l,u).
    2. Take the sample and calculate its correlation matrix => C.
    3. This matrix will have some randomness (fuzz), but the math of how much fuzz it will have is a little out of my league to calculate. The values of the off-diags in this C matrix are bounded by [-1,1], with mean mean(l,u). By eyeball, I'm guessing some sort of beta/exponential shape. In any case, that continuous distribution of the off-diags in C guarantees it won't obligingly lie inside the bounds (l,u), unless (l,u) = [-1,1].
    4. You can adjust the amount of "fuzz" by increasing / decreasing the length of the sample in step 1. I'd wager (unproven) that the spread of C's off-diags shrinks like one over the square root of the number of samples.

    So this seems non-trivial to truly answer!

    As other posters have suggested, you can generate from Wishart, then keep the ones where the property you want is true, but you might be sampling for a long time! If you exclude those that are 0-definite (is that a word?) then this should work fine for generating good matrices. However, this is not the true distribution of all pos-def matrices whose off-diags are in [l,u].
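    That rejection idea can be sketched in plain numpy (the Wishart draw is built from df multivariate-normal rows; df, max_tries, and the seed are illustrative tuning choices, not from the answer):

```python
import numpy as np

def sample_constrained_corr(p, lo, hi, df=500, max_tries=500, seed=0):
    """Draw Wishart(df, scale) matrices whose scale has off-diagonals at
    mean(lo, hi), convert each to a correlation matrix, and keep the
    first one whose off-diagonals all land in [lo, hi]."""
    rng = np.random.default_rng(seed)
    scale = np.full((p, p), (lo + hi) / 2.0)
    np.fill_diagonal(scale, 1.0)
    L = np.linalg.cholesky(scale)
    for _ in range(max_tries):
        X = rng.standard_normal((df, p)) @ L.T   # df rows ~ N(0, scale)
        S = X.T @ X                              # ~ Wishart(df, scale)
        d = np.sqrt(np.diag(S))
        C = S / np.outer(d, d)                   # covariance -> correlation
        off = C[~np.eye(p, dtype=bool)]
        if off.min() >= lo and off.max() <= hi:
            return C
    raise RuntimeError("no draw accepted; widen [lo, hi] or raise df")

C = sample_constrained_corr(4, 0.3, 0.5)
```

    Larger df concentrates the sample correlations around mean(lo, hi), so acceptance improves, at the cost of less variability between accepted matrices.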

    Code (in R) for the dumb-sampling scheme proposed above:

    sigma1 <- function(n,sigma) {
        out <- matrix(sigma,n,n)
        diag(out) <- 1
        return (out)
    }
    
    library(mvtnorm)
    sample_around_sigma <- function(size, upper,lower, tight=500) {
        #  size:  size of matrix
        #  upper, lower:  bounds on the corr, should be > 0
        #  tight:  number of samples to use.  ideally this
        #     would be calculated such that the off-diags will
        #     be "pretty likely" to fall in [lower,upper]
        sigma <- sigma1(size,mean(c(upper,lower)))
        means <- 0*1:size
        samples <- rmvnorm(n=tight, mean=means,sigma=sigma)
        return (cor(samples))
    }
    
    > A <- sample_around_sigma(5, .3,.5)
    > A
              [,1]      [,2]      [,3]      [,4]      [,5]
    [1,] 1.0000000 0.3806354 0.3878336 0.3926565 0.4080125
    [2,] 0.3806354 1.0000000 0.4028188 0.4366342 0.3801593
    [3,] 0.3878336 0.4028188 1.0000000 0.4085453 0.3814716
    [4,] 0.3926565 0.4366342 0.4085453 1.0000000 0.3677547
    [5,] 0.4080125 0.3801593 0.3814716 0.3677547 1.0000000
    > 
    > summary(A[lower.tri(A)]); var(A[lower.tri(A)])
       Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
     0.3678  0.3808  0.3902  0.3947  0.4067  0.4366 
    [1] 0.0003949876
    
  • 2021-01-13 12:53

    OK, fantastic Gregg: we're getting somewhere. Combining your idea with woodchips's yields this alternative approach. It's mathematically very dirty, but it seems to work:

    library(MCMCpack)
    library(MASS)
    p<-10
    lb<-.6
    ub<-.8
    zupa<-function(theta){
        ac<-matrix(theta,p,p)
        fe<-rwish(100*p,ac%*%t(ac))
        det(fe)
    }
    ba<-optim(runif(p^2,-10,-5),zupa,control=list(maxit=10))
    ac<-matrix(ba$par,p,p)
    fe<-rwish(100*p,ac%*%t(ac))
    me<-mvrnorm(p+1,rep(0,p),fe)
    A<-cor(me)
    bofi<-sqrt(diag(var(me)))%*%t(sqrt((diag(var(me)))))
    va<-A[lower.tri(A)]
    l1=100
    while(l1>0){
        r1<-which(va>ub)
        l1<-length(r1)
        va[r1]<-va[r1]*.9
    }
    A[lower.tri(A)]<-va
    A[upper.tri(A)]<-va
    vari<-bofi*A
    mk<-mvrnorm(10*p,rep(0,p),vari)
    pc<-sign(runif(p,-1,1))
    mf<-sweep(mk,2,pc,"*")
    B<-cor(mf)
    summary(abs(B[lower.tri(B)]))
    

    Basically, the idea is this (say the upper bound is .8 and the lower bound is .6): it has a good enough acceptance rate, which is not 100%, but it'll do at this stage of the project.
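    One caveat worth checking: overwriting off-diagonal entries, as the while loop above does, can destroy positive definiteness, so it's worth verifying the adjusted matrix before resampling from it. A small Python check (the helper name is hypothetical, not from the answer):

```python
import numpy as np

def is_positive_definite(A, tol=1e-10):
    """Symmetrize, then require every eigenvalue to exceed tol."""
    A = (A + A.T) / 2.0
    return bool(np.linalg.eigvalsh(A).min() > tol)

# A matrix whose entries are individually legal correlations but which
# is not jointly realizable as a correlation matrix:
C = np.array([[1.00, 0.95, 0.00],
              [0.95, 1.00, 0.95],
              [0.00, 0.95, 1.00]])
print(is_positive_definite(C))    # → False
```

    A failed check here is presumably one source of the less-than-100% acceptance rate: mvrnorm will refuse a Sigma that is not positive definite.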
