PCA, or Principal Component Analysis, finds an orthonormal basis that best represents the data.
 
PCA diagonalizes the maximum likelihood estimate of the covariance matrix (here the samples <math>\vec{x_i}</math> are assumed to be zero-mean)

<math>C=\frac{1}{n} \sum_{i=1}^{n} \vec{x_i}\vec{x_i}^T</math>
  
 
by solving the eigenvalue equation

<math>C\vec{e} = \lambda \vec{e}</math>
The solutions to this equation are the eigenvalue/eigenvector pairs, with eigenvalues ordered as <math>\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_m</math>. Often only <math>k < m</math> of the eigenvalues are nonzero, meaning that the inherent dimensionality of the data is <math>k</math> and the remaining <math>m-k</math> dimensions are noise.
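A minimal NumPy sketch of this procedure, assuming zero-mean samples stored as the rows of <code>X</code> (the function name <code>pca</code> and the random data are illustrative, not part of the original page):

<pre>
import numpy as np

def pca(X, k):
    """Top-k principal components of zero-mean samples (rows of X)."""
    n, m = X.shape
    C = (X.T @ X) / n                      # C = (1/n) sum_i x_i x_i^T
    eigvals, eigvecs = np.linalg.eigh(C)   # eigh, since C is symmetric
    order = np.argsort(eigvals)[::-1]      # sort lambda_1 >= ... >= lambda_m
    return eigvals[order][:k], eigvecs[:, order][:, :k]

# Illustrative usage: 100 samples in m = 5 dimensions, keep k = 2.
X = np.random.randn(100, 5)
X -= X.mean(axis=0)        # center the data so the MLE formula applies
vals, basis = pca(X, 2)    # columns of basis are orthonormal directions
Y = X @ basis              # coordinates of each sample in that basis
</pre>

In practice a numerically stabler route is an SVD of the centered data matrix rather than forming <math>C</math> explicitly, but the eigendecomposition above mirrors the equations as written.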


