Hi everyone!

I have a set of vectors from which I construct a covariance matrix. Then I find its eigenvalues and corresponding eigenvectors (basically, I do PCA). Next I look for the smallest significant eigenvalues. To do this I generate bootstrap samples, construct a covariance matrix for each bootstrap sample, and find its eigenvalues. From these bootstrap eigenvalues I construct a 95% confidence interval. I test the significance of the smallest eigenvalue by checking whether it falls inside the bootstrap 95% confidence interval. If it does, I subtract from my original vectors their projection onto the eigenvector corresponding to the significant smallest eigenvalue, and then repeat the procedure for the second smallest eigenvalue. A rough sketch of the procedure is below.
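In code, the bootstrap part looks roughly like this (a minimal NumPy sketch, assuming each row of `X` is one of my vectors and that I resample the rows with replacement; names like `n_boot` are just for illustration and my actual bootstrap may differ in details):

```python
import numpy as np

def bootstrap_smallest_eigenvalue_ci(X, n_boot=1000, alpha=0.05, rng=None):
    """Bootstrap 95% confidence interval for the smallest covariance eigenvalue.

    X : (n_samples, n_features) array, rows are the original vectors.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = X.shape[0]
    smallest = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)          # resample rows with replacement
        C_b = np.cov(X[idx], rowvar=False)        # covariance of the bootstrap sample
        smallest[b] = np.linalg.eigvalsh(C_b)[0]  # eigvalsh sorts ascending, [0] = smallest
    lo, hi = np.quantile(smallest, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# The smallest eigenvalue of the original covariance matrix is then
# compared against (lo, hi) to decide whether it is significant.
```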

However, after projecting out an eigenvector, the next confidence interval becomes much smaller (the bootstrapped smallest eigenvalues decrease rapidly), and then all of the following smallest eigenvalues come out significant. Obviously, this happens because I project out the significant eigenvector every time I find a significant eigenvalue. How can I deal with this problem? Should I somehow normalize the projection I am subtracting?
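For reference, the "projecting out" step I am doing is essentially this (a minimal sketch, where `v` is assumed to be the eigenvector of the significant smallest eigenvalue):

```python
def deflate(X, v):
    """Remove from each row of X its projection onto the eigenvector v."""
    v = v / np.linalg.norm(v)        # make sure v has unit length
    return X - np.outer(X @ v, v)    # subtract the rank-one projection onto v
```

After this step every row of `X` is orthogonal to `v`, and I then rerun the whole bootstrap procedure on the deflated data.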

Sorry for such a long description of the problem. Thank you in advance.

Alex