Low-rank models and projections
Description
Projecting data points:
- Given a data matrix X with m columns (each column x_j is a data point), each column is projected to a k-dimensional vector h_j.
- The projection multiplies x_j by the transpose of U_k, the matrix whose columns are the k left singular vectors of X corresponding to the k largest singular values.
- Project each data point x_j to a k-dimensional vector h_j: h_j = U_k^T x_j (a short sketch follows below)
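A minimal NumPy sketch of this projection, assuming a toy d x m data matrix with data points as columns (the names d, m, k, X, U_k, and H are placeholders following the notation above, not from the source):

```python
import numpy as np

# Assumed toy setup for illustration: d features, m data points as columns of X.
d, m, k = 5, 8, 2
rng = np.random.default_rng(0)
X = rng.standard_normal((d, m))

# SVD of X: columns of U are the left singular vectors; the singular values
# in s are already sorted from largest to smallest.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# U_k: the k left singular vectors belonging to the k largest singular values.
U_k = U[:, :k]

# Project every data point x_j to h_j = U_k^T x_j (all columns at once).
H = U_k.T @ X
print(H.shape)  # (k, m): one k-dimensional vector h_j per data point
```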
- With the data matrix X, use the SVD to find the rank-k approximation X ≈ UH (see the sketch after this list)
  - U: matrix whose k columns are orthonormal basis vectors (the top-k left singular vectors of X)
  - H: k x m matrix of coefficients; column h_j represents data point x_j in that basis
  - each data point of X is a column vector
  - each feature of X is a row vector
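Continuing the same assumed toy setup, the rank-k approximation X ≈ UH can be formed and checked against the truncated SVD (a sketch under those assumptions, not a definitive implementation):

```python
import numpy as np

# Same assumed toy setup as in the sketch above.
d, m, k = 5, 8, 2
rng = np.random.default_rng(0)
X = rng.standard_normal((d, m))

U, s, Vt = np.linalg.svd(X, full_matrices=False)

U_k = U[:, :k]      # U: k orthonormal basis vectors (columns)
H = U_k.T @ X       # H: k x m coefficients, column j is h_j

X_k = U_k @ H       # rank-k approximation X ≈ U H

# Equivalent to keeping only the k largest singular values in the SVD.
X_trunc = U_k @ np.diag(s[:k]) @ Vt[:k, :]
print(np.allclose(X_k, X_trunc))        # True
print(np.linalg.norm(X - X_k, 'fro'))   # size of the approximation error
```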