Matrices with off-diagonal decay appear in a variety of fields in mathematics and
in numerous applications, such as signal processing, statistics, communications
engineering, condensed matter physics, and quantum chemistry. Numerical algorithms dealing
with such matrices often take advantage (implicitly or explicitly) of the empirical
observation that this off-diagonal decay property seems to be preserved when computing
various useful matrix factorizations, such as the Cholesky or QR factorization. There is a fairly extensive theory describing when the inverse of a
matrix inherits the localization properties of the original matrix. Yet, except for the
special case of band matrices, surprisingly little theory exists that would establish
similar results for matrix factorizations. We derive a comprehensive framework to
rigorously answer the question of when and under which conditions the matrix factors inherit
the localization of the original matrix for such fundamental factorizations as the
LU, QR, Cholesky, and polar factorization.
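
As a purely illustrative aside (not part of the paper's analysis), the empirical observation mentioned above can be reproduced in a few lines of NumPy. The sketch below uses an assumed test case, a Kac-Murdock-Szegő-type matrix of size n = 200 with decay rate alpha = 0.5; both parameters are choices made for the illustration, not values taken from the paper. It computes the Cholesky factor and compares the decay of its entries away from the diagonal with that of the original matrix.

```python
# Minimal sketch (illustrative only): a symmetric positive definite matrix
# with exponential off-diagonal decay is factored, and the Cholesky factor
# is observed to exhibit a comparable off-diagonal decay.
import numpy as np

n = 200      # matrix size (assumption for this sketch)
alpha = 0.5  # assumed decay rate: A[i, j] = exp(-alpha * |i - j|)

# Kac-Murdock-Szegő-type matrix: SPD with exponentially decaying entries.
idx = np.arange(n)
A = np.exp(-alpha * np.abs(idx[:, None] - idx[None, :]))

# Lower-triangular Cholesky factor L with A = L @ L.T.
L = np.linalg.cholesky(A)

# Largest entry in magnitude on each sub-diagonal, as a crude decay profile.
def offdiag_profile(M):
    return np.array([np.abs(np.diag(M, -k)).max() for k in range(n)])

prof_A = offdiag_profile(A)
prof_L = offdiag_profile(L)
for k in (1, 5, 10, 20, 40):
    print(f"offset {k:2d}:  max|A| = {prof_A[k]:.2e}   max|L| = {prof_L[k]:.2e}")
```

Running the sketch shows both profiles decaying geometrically with the distance from the diagonal, which is the kind of localization-inheritance phenomenon the framework developed here makes rigorous.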