Collection of Matrix Factorization Algorithms

Date: 2021-05-21 02:05:07
Original post: The Matrix Factorization Jungle, the matrix factorization algorithms available today. Author: loveLifeAwayWWW

The Matrix Factorization Jungle

September 5, 2011, posted by 忙菇

Some diligent folks in the US have collected nearly all of the matrix factorization algorithms and applications out there. Since the source page sits beyond a certain mysterious barrier, it is reposted here. Source address:

Matrix decompositions have a long history and generally center around a set of known factorizations such as LU, QR, SVD and eigendecompositions. More recent factorizations have seen the light of day with work started with the advent of NMF, k-means and related algorithms [1]. However, with the advent of new methods based on random projections and convex optimization that started in part in the compressive sensing literature, we are seeing another surge of very diverse algorithms dedicated to many different kinds of matrix factorizations with new constraints based on rank and/or positivity and/or sparsity... As a result of this large increase in interest, I have decided to keep a list of them here, following the success of the big picture in compressive sensing.

The sources for this list include the following most excellent sites: Stephen Becker's page, Raghunandan H. Keshavan's page, Nuclear Norm and Matrix Recovery through SDP by Christoph Helmberg, and Arvind Ganesh's Low-Rank Matrix Recovery and Completion via Convex Optimization, which provide more in-depth additional information. Additional codes were also featured on Nuit Blanche. The following people provided additional inputs: Olivier Grisel, Matthieu Puigt.

Most of the algorithms listed below generally rely on using the nuclear norm as a proxy to the rank functional. It may not be optimal. Currently, CVX (Michael Grant and Stephen Boyd) consistently allows one to explore other proxies for the rank functional, such as the log-det heuristic of Maryam Fazel, Haitham Hindi and Stephen Boyd. ** is used to show that an algorithm uses a heuristic other than the nuclear norm.
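To make the "nuclear norm as a rank proxy" idea concrete, here is a tiny numerical illustration (a minimal sketch, assuming only numpy; the matrix sizes are arbitrary): on the unit spectral-norm ball the nuclear norm lower-bounds the rank, which is why it serves as the standard convex surrogate.

```python
# Minimal numerical illustration: on the unit spectral-norm ball,
# the nuclear norm ||A||_* (sum of singular values) lower-bounds rank(A),
# which is why it is used as a convex surrogate for the rank functional.
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 40, 30, 5
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # rank-5 matrix
A /= np.linalg.norm(A, 2)           # rescale so the spectral norm is 1

sv = np.linalg.svd(A, compute_uv=False)
nuclear_norm = sv.sum()             # sum of singular values
rank = np.linalg.matrix_rank(A)

print(f"rank(A) = {rank}, ||A||_* = {nuclear_norm:.3f} (<= rank on the unit ball)")
```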

In terms of notation, A refers to a matrix, L to a low-rank matrix, S to a sparse one and N to a noisy one. This page lists the different codes that implement the following matrix factorizations: Matrix Completion, Robust PCA, Noisy Robust PCA, Sparse PCA, NMF, Dictionary Learning, MMV, Randomized Algorithms and other factorizations. Some of these toolboxes can sometimes implement several of these decompositions and are listed accordingly. Before I list an algorithm here, I generally feature it on Nuit Blanche under the MF tag: http://nuit-blanche.blogspot.com/search/label/MF, or you can also subscribe to the Nuit Blanche feed.

Matrix Completion: A = H.*L with H a known mask and L unknown; solve for L with the lowest rank possible

The idea of this approach is to complete the unknown coefficientsof a matrix based on the fact that the matrix is low rank:
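As a rough illustration of this idea, here is a sketch of nuclear-norm matrix completion in the SoftImpute style (an assumption on my part, not one of the listed codes; numpy only, with the shrinkage threshold and iteration count picked for illustration): alternate between filling in the missing entries and soft-thresholding the singular values.

```python
# Rough sketch of nuclear-norm matrix completion (SoftImpute-style iteration):
# fill the unobserved entries with the current estimate, then apply
# singular-value soft-thresholding, which is the prox of the nuclear norm.
import numpy as np

def complete(A_obs, H, tau=1.0, n_iter=200):
    """A_obs: observed matrix (zeros where unobserved), H: boolean mask."""
    L = np.zeros_like(A_obs)
    for _ in range(n_iter):
        Z = np.where(H, A_obs, L)                      # impute missing entries
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        L = (U * np.maximum(s - tau, 0.0)) @ Vt        # shrink singular values
    return L

rng = np.random.default_rng(0)
m, n, r = 60, 50, 3
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # low-rank truth
H = rng.random((m, n)) < 0.5                                   # ~50% observed
L_hat = complete(H * A, H)
print("relative error:", np.linalg.norm(L_hat - A) / np.linalg.norm(A))
```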

Noisy Robust PCA: A = L + S + N with L, S, N unknown; solve for L low rank, S sparse, N noise

Robust PCA: A = L + S with L, S unknown; solve for L low rank, S sparse
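For Robust PCA (and, with a relaxed equality constraint, the noisy variant above), here is a hedged sketch of one standard approach, Principal Component Pursuit solved by an augmented-Lagrangian (ADMM-style) iteration; the weight lam, the step size mu and the iteration count are illustrative assumptions, not a reference implementation of any of the listed codes.

```python
# Sketch of Principal Component Pursuit for Robust PCA: A = L + S with
# L low rank and S sparse, solved by alternating proximal steps plus a
# dual update on the constraint A = L + S.
import numpy as np

def shrink(X, t):                      # soft-thresholding: prox of the l1 norm
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def svt(X, t):                         # singular-value thresholding: prox of ||.||_*
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - t, 0.0)) @ Vt

def robust_pca(A, n_iter=300):
    m, n = A.shape
    lam = 1.0 / np.sqrt(max(m, n))     # usual weight on the sparse term
    mu = 0.25 * m * n / np.abs(A).sum()
    L = np.zeros_like(A); S = np.zeros_like(A); Y = np.zeros_like(A)
    for _ in range(n_iter):
        L = svt(A - S + Y / mu, 1.0 / mu)
        S = shrink(A - L + Y / mu, lam / mu)
        Y = Y + mu * (A - L - S)       # dual ascent on A = L + S
    return L, S

rng = np.random.default_rng(0)
L0 = rng.standard_normal((80, 3)) @ rng.standard_normal((3, 60))   # low rank
S0 = (rng.random((80, 60)) < 0.05) * rng.standard_normal((80, 60)) * 10  # sparse
L_hat, S_hat = robust_pca(L0 + S0)
print("low-rank error:", np.linalg.norm(L_hat - L0) / np.linalg.norm(L0))
```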

Sparse PCA: A = DX with unknown D and X; solve for sparse D

Sparse PCA on Wikipedia

  • R. Jenatton, G. Obozinski, F. Bach. Structured Sparse Principal Component Analysis. International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf][code]
  • SPAMS
  • DSPCA: Sparse PCA using SDP. Code is here.
  • PathSPCA: A fast greedy algorithm for Sparse PCA. The code is here.
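As a quick illustration of the Sparse PCA model, here is a usage sketch with scikit-learn's SparsePCA (not one of the codes listed above; the random data, n_components and alpha are assumptions for illustration). In scikit-learn's convention the sparse factor is exposed as components_ and the codes come from fit_transform.

```python
# Usage sketch: sparse principal components with scikit-learn's SparsePCA.
import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 30))          # samples x features

spca = SparsePCA(n_components=5, alpha=1.0, random_state=0)
X = spca.fit_transform(A)                   # codes
D = spca.components_                        # sparse components (the sparse factor)
print("fraction of zero loadings:", np.mean(D == 0))
```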

Dictionary Learning: A = DX with unknown D and X; solve for sparse X

Some implementations of dictionary learning also implement NMF.
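A brief usage sketch with scikit-learn's DictionaryLearning (again just an illustration, not one of the listed toolboxes; n_components, alpha, max_iter and the random data are assumptions):

```python
# Usage sketch: learn a dictionary D and sparse codes X so that A ≈ XD.
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))             # signals as rows

dl = DictionaryLearning(n_components=15, alpha=1.0, max_iter=50, random_state=0)
X = dl.fit_transform(A)                        # sparse codes
D = dl.components_                             # dictionary atoms
print("mean nonzeros per code:", np.mean(np.count_nonzero(X, axis=1)))
print("reconstruction error:", np.linalg.norm(X @ D - A) / np.linalg.norm(A))
```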

NMF: A = DX with unknown D and X; solve for elements of D, X > 0

Non-negative Matrix Factorization (NMF) on Wikipedia
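A minimal usage sketch with scikit-learn's NMF (an illustrative stand-in for the many NMF codes; the non-negative test data, rank and initialization are assumptions):

```python
# Usage sketch: non-negative factorization A ≈ WH with W, H >= 0 elementwise.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
A = rng.random((100, 40))                      # non-negative data

model = NMF(n_components=8, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(A)                     # non-negative factor
H = model.components_                          # non-negative factor
print("relative error:", np.linalg.norm(A - W @ H) / np.linalg.norm(A))
```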

Multiple Measurement Vector (MMV): Y = AX with unknown X and the rows of X sparse.
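As a hedged sketch of the MMV setting, one can solve the usual l2/l1 (row-sparsity) relaxation with scikit-learn's MultiTaskLasso, which is a stand-in here rather than one of the dedicated MMV codes; the problem sizes, sparsity level and alpha below are assumptions.

```python
# Sketch of joint-sparse recovery: Y = AX with only a few nonzero rows of X.
# MultiTaskLasso penalizes the l2 norms of the rows, promoting row sparsity.
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.default_rng(0)
m, n, k = 40, 100, 8                  # measurements, unknowns, measurement vectors
A = rng.standard_normal((m, n))
X0 = np.zeros((n, k))
rows = rng.choice(n, size=5, replace=False)
X0[rows] = rng.standard_normal((5, k))        # only 5 active rows
Y = A @ X0

mtl = MultiTaskLasso(alpha=0.1, fit_intercept=False, max_iter=5000).fit(A, Y)
X_hat = mtl.coef_.T                   # coef_ has shape (n_tasks, n_features)
active = np.where(np.linalg.norm(X_hat, axis=1) > 1e-6)[0]
print("true rows:", sorted(rows), "recovered rows:", sorted(active))
```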

Blind Source Separation (BSS): Y = AX with unknown A and X, and statistical independence between the columns of X or subspaces of the columns of X

This includes Independent Component Analysis (ICA), Independent Subspace Analysis (ISA), and Sparse Component Analysis (SCA). There are many available codes for ICA and some for SCA. Here is a non-exhaustive list of some famous ones (which are not limited to linear instantaneous mixtures). TBC. A short ICA sketch follows the headings below.

ICA:

SCA:
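As an illustration of the ICA case, here is a small example with scikit-learn's FastICA, one of many available ICA implementations; the synthetic sources and mixing matrix below are assumptions for demonstration only.

```python
# Sketch of blind source separation by ICA: recover independent sources
# from their unknown linear mixture.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t)), rng.laplace(size=t.size)]
A_mix = rng.standard_normal((3, 3))            # unknown mixing matrix
Y = S @ A_mix.T                                # observed mixtures

ica = FastICA(n_components=3, random_state=0)
S_hat = ica.fit_transform(Y)                   # estimated independent sources
A_hat = ica.mixing_                            # estimated mixing matrix
print("mixtures:", Y.shape, "recovered sources:", S_hat.shape)
```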

Randomized Algorithms

These algorithms generally use random projections to shrink very large problems into smaller ones that are amenable to traditional matrix factorization methods.
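A compact sketch of that random-projection idea, assuming plain numpy (the oversampling amount and matrix sizes are illustrative; a library version exists as sklearn.utils.extmath.randomized_svd): a Gaussian sketch captures the range of the matrix, and a small exact SVD is then computed on the reduced problem.

```python
# Sketch of a randomized (low-rank) SVD via a Gaussian range finder.
import numpy as np

def rand_svd(A, rank, oversample=10):
    n = A.shape[1]
    rng = np.random.default_rng(0)
    Omega = rng.standard_normal((n, rank + oversample))  # random projection
    Q, _ = np.linalg.qr(A @ Omega)         # orthonormal basis for the range of A
    B = Q.T @ A                            # small (rank+oversample) x n problem
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank]

A = np.random.default_rng(1).standard_normal((2000, 50)) @ \
    np.random.default_rng(2).standard_normal((50, 1500))     # exactly rank 50
U, s, Vt = rand_svd(A, rank=50)
print("relative error:", np.linalg.norm(U * s @ Vt - A) / np.linalg.norm(A))
```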

Resource
Randomized algorithms for matrices and data by Michael W. Mahoney
Randomized Algorithms for Low-Rank Matrix Decomposition

Other factorization

D(T(.)) = L + E with unknown L, E and unknown transformation T; solve for the transformation T, low-rank L and noise E

Frameworks featuring advanced matrix factorizations

For the time being, few have integrated the most recentfactorizations.

GraphLab / Hadoop

Books

Example of use

Sources

Arvind Ganesh's Low-Rank Matrix Recovery and Completion via Convex Optimization

Relevant links

Reference:

A Unified View of Matrix Factorization Models by Ajit P. Singh and Geoffrey J. Gordon