How exactly does PCA reduce the dimensionality of an image?

It doesn't seem to be reducing the dimensionality of the samples themselves, does it? Because it computes the eigenvalues and eigenvectors of the covariance matrix rather than of the samples?

Yes. PCA keeps the eigenvectors that contribute most to the covariance, i.e. those with the largest eigenvalues of the covariance matrix. That is one way of reducing dimensionality. You can of course also reduce dimensionality directly from the eigenvectors of the sample matrix itself; that is another method.
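
To make the covariance route concrete, here is a minimal MATLAB sketch (the setup is assumed for illustration: X holds one sample per row, N is the target dimension; none of these names appear in the original answer):

% PCA by eigen-decomposition of the covariance matrix (illustrative sketch)
X = randn(100, 5);                   % 100 samples, each with d = 5 attributes
N = 2;                               % keep the N largest eigenvalues
Xc = X - mean(X, 1);                 % center each attribute
C = cov(Xc);                         % d-by-d covariance matrix
[V, D] = eig(C);                     % eigenvectors and eigenvalues of C
[~, idx] = sort(diag(D), 'descend'); % order by eigenvalue, largest first
W = V(:, idx(1:N));                  % top-N principal directions
Y = Xc * W;                          % 100-by-N: samples reduced to N dimensions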

After an SVD you likewise keep only the largest few singular values; recomputing the product then gives back an approximation of the original samples.

Dimension reduction can be achieved simply by choosing the N largest eigenvalues and their corresponding eigenvectors.

You can write a program to see the result. svd is a built-in function in MATLAB.

Example: one d-dimensional vector -> a sample represented by d attributes
many vectors stacked into a matrix M -> many samples

Take the SVD of M, M = U*S*V'; keep only the N largest singular values in S to compose S_star (the other values are replaced by zero), then compute M_star = U*S_star*V'.

M_star is the result. It is a rank-N approximation of M: all but its N largest singular values are zero (numerically, very small rather than exactly zero), so every sample in M_star lies in an N-dimensional subspace.

The original d-dimensional vectors have now been reduced to N dimensions.
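
Here is that truncation as a runnable MATLAB sketch (the sizes and the names M, N, S_star, M_star, reduced are illustrative assumptions):

% SVD truncation as described above (illustrative sketch)
M = randn(5, 20);           % 20 samples, each a d = 5 dimensional vector
N = 2;                      % keep the N largest singular values
[U, S, V] = svd(M);         % M = U*S*V'
S_star = S;
S_star(N+1:end, :) = 0;     % zero all but the N largest singular values
M_star = U * S_star * V';   % rank-N approximation of the samples
reduced = U(:, 1:N)' * M;   % N-by-20: each sample's coordinates in the N-dimensional subspace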