

On Github mikepb / ubc-matrix-decomposition

Matrix Decomposition

approximation via randomization

by michael phan-ba / mikepb@cs.ubc.ca

What is matrix decomposition?

$$\large \mathbf{A} \approx \mathbf{B}\mathbf{C}$$

\(\mathbf{A}\) is \(m \times n\), \(\mathbf{B}\) is \(m \times k\), \(\mathbf{C}\) is \(k \times n\)

General idea

  • Large matrices often have low numerical rank
  • A much smaller matrix can capture most of their action
  • Classical factorization algorithms are fast on small matrices
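The "much smaller matrix" point can be made concrete with a short NumPy sketch (the sizes here are illustrative, not from the slides):

```python
import numpy as np

m, n, k = 500, 300, 10  # illustrative sizes: A is m x n with rank k
rng = np.random.default_rng(0)

# Build a rank-k matrix A = B @ C explicitly
B = rng.standard_normal((m, k))  # m x k
C = rng.standard_normal((k, n))  # k x n
A = B @ C                        # m x n, but only rank k

# The two small factors describe A exactly with far fewer numbers
full_entries = m * n               # 150,000
factored_entries = m * k + k * n   # 8,000
print(full_entries, factored_entries)
```

Here the factored form stores nearly 20x fewer numbers, and that gap widens as \(m\) and \(n\) grow relative to \(k\).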

How much data?

Today's data sets are enormous.

Randomized sampling accelerates the classical deterministic algorithms.

Example: QR factorization

$$\large \mathbf{A} \approx \mathbf{Q}\mathbf{R}$$

\(\mathbf{A}\) is \(m \times n\), \(\mathbf{Q}\) is \(m \times k\), \(\mathbf{R}\) is \(k \times n\)

Deterministic: \(O(kmn)\) flops
Randomized: \(O\left(mn \log(k) + (m + n)k^2\right)\) flops

When \(k \ll \min\{m,n\}\), the randomized algorithm needs fewer flops.

Randomization is more efficient.

“For a matrix that is too large to fit in fast memory, the randomized techniques require only a constant number of passes over the data, as opposed to \(O(k)\) passes for classical algorithms.” —Halko et al.

Prototype algorithm

  1. Sample the input matrix
  2. Use classical methods to compute the final factorization

Approximating the range

$$\large\|\mathbf{A} - \mathbf{QQ}^*\mathbf{A}\| \leq \epsilon$$

Construct \(\mathbf{Y} = \mathbf{A\Omega}\), with \(\mathbf{Q}\) an orthonormal basis for the range of \(\mathbf{Y}\), such that \(\mathbf{Y}\) captures most of the action of \(\mathbf{A}\).
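This sampling step fits in a few lines of NumPy. A sketch, using a Gaussian test matrix \(\Omega\) with a small amount of oversampling (the sizes and the oversampling parameter `p` are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k, p = 500, 300, 10, 5    # illustrative sizes; p is oversampling

# A test input: low rank plus a little noise
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))
A += 1e-8 * rng.standard_normal((m, n))

Omega = rng.standard_normal((n, k + p))   # Gaussian test matrix
Y = A @ Omega                              # sample the range of A
Q, _ = np.linalg.qr(Y)                     # orthonormal basis for range(Y)

# How much of A's action does Q capture?
err = np.linalg.norm(A - Q @ (Q.T @ A)) / np.linalg.norm(A)
print(err)  # small: Q captures essentially all of A's action
```

Even though \(\mathbf{Y}\) has only \(k + p\) columns, the residual is on the order of the noise floor.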


Capture action of \(\mathbf{A}\)

  • Reduce the size of \(\mathbf{A}\) to \(\mathbf{Y}\) while approximating the range
  • The lower the rank of \(\mathbf{A}\), the smaller the output matrix \(\mathbf{Y}\) can be

Algorithms for approximating the range

  • Gaussian Johnson-Lindenstrauss: \(O(kmn)\)
  • Fast Johnson-Lindenstrauss: \(O(mn\log k)\)
  • Sparse Johnson-Lindenstrauss
  • Variants of the JL transform
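For intuition, these variants differ mainly in how the test matrix is constructed while preserving norms. A sketch comparing a dense Gaussian projection with a simple random-sign sparse one (the sparse construction below is an illustrative stand-in, not the exact transform from the cited papers; all sizes are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 200, 40  # project n-dimensional vectors down to d dimensions

# Gaussian JL: dense N(0, 1/d) entries
G = rng.standard_normal((d, n)) / np.sqrt(d)

# Sparse JL (illustrative): each column has s nonzero +/- 1/sqrt(s) entries,
# so applying it costs far fewer operations than the dense version
s = 4
S = np.zeros((d, n))
for j in range(n):
    rows = rng.choice(d, size=s, replace=False)
    S[rows, j] = rng.choice([-1.0, 1.0], size=s) / np.sqrt(s)

x = rng.standard_normal(n)
print(np.linalg.norm(x), np.linalg.norm(G @ x), np.linalg.norm(S @ x))
# the three norms agree to within the JL distortion
```

Both projections preserve \(\|x\|\) up to a modest distortion; the sparse one touches only \(s\) entries per input coordinate.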

Factorization

$$\large \mathbf{A} \approx \mathbf{Y} = \mathbf{PX}$$

Construct low-rank matrices \(\mathbf{P}\) and \(\mathbf{X}\) such that \(\mathbf{A} \approx \mathbf{PX}\).
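Continuing the sketch from the range-approximation stage: once an orthonormal basis for the sample \(\mathbf{Y}\) is in hand, the small factor is obtained by projection. Here \(\mathbf{P}\) is taken to be that basis and \(\mathbf{X} = \mathbf{P}^*\mathbf{A}\), one natural reading of the slide's \(\mathbf{PX}\) (sizes illustrative, input exactly rank \(k\)):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, k = 400, 250, 8
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))  # rank-k input

# Stage 1: approximate the range via Gaussian sampling
Y = A @ rng.standard_normal((n, k))
P, _ = np.linalg.qr(Y)          # m x k, orthonormal columns

# Stage 2: the small factor
X = P.T @ A                     # k x n

err = np.linalg.norm(A - P @ X) / np.linalg.norm(A)
print(err)
```

For an exactly rank-\(k\) input the product \(\mathbf{PX}\) recovers \(\mathbf{A}\) to machine precision; for noisy inputs, oversampling keeps the error near the noise floor.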

Interpolative decomposition

  • Also called row extraction by Halko et al.


Select rows of \(\mathbf{A}\)


  • The ID outputs matrices \(\mathbf{Q}\) and \(\mathbf{X}\)
  • Postprocess to obtain an approximate SVD \(\mathbf{A} \approx \mathbf{U\Sigma V}^*\)
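A hedged sketch of the row-selection idea, using column-pivoted QR on \(\mathbf{A}^*\) to pick representative rows (this is one standard way to realize an ID, not necessarily the exact variant in the cited papers; sizes are illustrative):

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(3)
m, n, k = 200, 150, 6
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))  # rank-k input

# Pick k "representative" rows via column-pivoted QR of A^T
_, _, piv = qr(A.T, pivoting=True)
idx = piv[:k]                      # indices of the selected rows
R = A[idx, :]                      # k x n submatrix of actual rows of A

# Coefficient matrix expressing every row of A in terms of the selected ones
X = A @ np.linalg.pinv(R)          # m x k

err = np.linalg.norm(A - X @ R) / np.linalg.norm(A)
print(err)
```

Because the factor \(\mathbf{R}\) consists of literal rows of \(\mathbf{A}\), it inherits properties such as sparsity, which is one of the selling points of the ID.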

Other factorization algorithms

  • QR factorization
  • SVD factorization
  • Interpolative decomposition

Numerical example: Eigenfaces

Numerical example: Interpolative decomposition simulation

  • Cray XMT supercomputer
  • Random Gaussian input

Randomized matrix decomposition

  1. Sample the input matrix
  2. Use classical methods to compute the final factorization
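Putting the two steps together, the whole prototype fits in a handful of lines. A sketch of a randomized SVD with Gaussian sampling (the function name, the oversampling default `p=5`, and the sizes are illustrative assumptions):

```python
import numpy as np

def randomized_svd(A, k, p=5, rng=None):
    """Prototype: randomized sampling, then a classical SVD on a small matrix."""
    if rng is None:
        rng = np.random.default_rng()
    # Step 1: sample the input matrix
    Y = A @ rng.standard_normal((A.shape[1], k + p))
    Q, _ = np.linalg.qr(Y)
    # Step 2: classical factorization of the small matrix Q* A
    U, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    return (Q @ U)[:, :k], s[:k], Vt[:k]

rng = np.random.default_rng(4)
A = rng.standard_normal((300, 6)) @ rng.standard_normal((6, 200))
U, s, Vt = randomized_svd(A, k=6, rng=rng)
err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
print(err)
```

Only step 1 ever touches the full matrix, which is what makes the constant-number-of-passes claim quoted earlier possible.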

References

  • N. Halko, P.-G. Martinsson, and J. A. Tropp, “Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions,” SIAM Review, vol. 53, no. 2, pp. 217–288, 2011.
  • P.-G. Martinsson, V. Rokhlin, and M. Tygert, “A randomized algorithm for the decomposition of matrices,” Applied and Computational Harmonic Analysis, vol. 30, no. 1, pp. 47–68, 2011.
  • N. Ailon and B. Chazelle, “The fast Johnson-Lindenstrauss transform and approximate nearest neighbors,” SIAM Journal on Computing, vol. 39, no. 1, pp. 302–322, 2009.
  • A. Dasgupta, R. Kumar, and T. Sarlós, “A sparse Johnson-Lindenstrauss transform,” in Proceedings of the 42nd ACM Symposium on Theory of Computing. ACM, 2010, pp. 341–350.

Q.E.D. ∎

mikepb.com