PCA Dimensionality Reduction


The covariance matrix should be computed from your trainingData:

X = bsxfun(@minus, trainingData, mean(trainingData, 1));   % center the training data
covariancex = (X' * X) ./ (size(X, 1) - 1);                % sample covariance matrix (900x900)

[V, D] = eigs(covariancex, 10);   % top 10 eigenvectors -> reduce to 10 dimensions

Xtest = bsxfun(@minus, test, mean(trainingData, 1));       % center test data with the TRAINING mean
pcatest = Xtest * V;                                       % project test data onto the 10 components
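
pcatest then has one row per test sample and 10 columns: the scores of each test image along the 10 principal directions (assuming test, like trainingData, stores one 900-dimensional image per row).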

From your code it looks like you are taking the covariance of the labels rather than of trainingData. The point of PCA is to find the N directions (N = 10 here) along which your data varies the most and project the data onto that subspace.

Your covariance matrix should be 900x900 (if 900 is the dimension of each image, e.g. the result of flattening 30x30 pixel images). The diagonal element [i,i] of covariancex gives the variance of pixel i across all training samples, and the off-diagonal element [i,j] gives the covariance between pixel i and pixel j. The matrix is symmetric, since [i,j] == [j,i].
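
As a quick sanity check you can verify those properties directly (a minimal sketch, assuming trainingData is numSamples x 900 with one flattened 30x30 image per row):

% Sanity checks on the covariance matrix (sketch; assumes trainingData is numSamples x 900)
X = bsxfun(@minus, trainingData, mean(trainingData, 1));
covariancex = (X' * X) ./ (size(X, 1) - 1);

size(covariancex)                                        % should be 900 x 900
issymmetric(covariancex)                                 % true: covariancex(i,j) == covariancex(j,i)
max(abs(diag(covariancex)' - var(trainingData, 0, 1)))   % ~0: diagonal holds the per-pixel variances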

Furthermore, when calling eigs(covariancex, N), N should be 10 instead of 40 if you want to reduce the dimension to 10.
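
If the Statistics and Machine Learning Toolbox is available, the built-in pca function makes a convenient cross-check for the manual eigs route (a sketch; the columns of coeff may differ from V only in sign):

% Cross-check with the built-in pca (sketch; requires Statistics and Machine Learning Toolbox)
[coeff, score] = pca(trainingData, 'NumComponents', 10);          % coeff is 900x10, score is numSamples x 10
pcatest2 = bsxfun(@minus, test, mean(trainingData, 1)) * coeff;   % same projection applied to the test set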
