In linear algebra, the Gram matrix (Gramian matrix or Gramian) of a set of vectors $v_1, \ldots, v_n$ in an inner product space is the Hermitian matrix of inner products, whose entries are given by $G_{ij} = \langle v_i, v_j \rangle$.
An important application is to test for linear independence: a set of vectors is linearly independent if and only if the Gram determinant (the determinant of the Gram matrix) is non-zero.
It is named after Jørgen Pedersen Gram.
Examples
For finite-dimensional real vectors in $\mathbb{R}^m$ with the usual Euclidean dot product, the Gram matrix is simply $G = V^\top V$ (or $G = V^\dagger V$ for complex vectors, using the conjugate transpose), where $V$ is the matrix whose columns are the vectors $v_k$.
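A minimal numerical sketch of this construction (using NumPy; the example vectors are illustrative and not taken from the article):

```python
import numpy as np

# Columns of V are the vectors v_1, v_2 (illustrative data).
V = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])
G = V.T @ V                       # G[i, j] = <v_i, v_j> for real vectors
assert np.allclose(G, G.T)        # symmetric (Hermitian in the complex case, G = V^H V)
```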
Most commonly, the vectors are elements of a Euclidean space, or are functions in an $L^2$ space, such as continuous functions on a compact interval $[a, b]$ (which are a subspace of $L^2([a, b])$).
Given real-valued functions $\{\ell_i(\cdot),\ i = 1, \ldots, n\}$ on the interval $[t_0, t_f]$, the Gram matrix $G = [G_{ij}]$ is given by the standard inner product on functions:
$G_{ij} = \int_{t_0}^{t_f} \ell_i(\tau)\, \ell_j(\tau)\, d\tau.$
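A sketch of the same construction for functions, approximating the integrals numerically (the monomial basis and the interval below are assumed for illustration):

```python
import numpy as np

# Approximate G_ij = \int_{t0}^{tf} l_i(t) l_j(t) dt with the trapezoidal rule.
t0, tf = 0.0, 1.0
t = np.linspace(t0, tf, 2001)
funcs = [lambda t: np.ones_like(t),   # l_1(t) = 1
         lambda t: t,                 # l_2(t) = t
         lambda t: t ** 2]            # l_3(t) = t^2
G = np.array([[np.trapz(fi(t) * fj(t), t) for fj in funcs] for fi in funcs])
# For these monomials, G approximates the 3x3 Hilbert matrix [1 / (i + j - 1)].
```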
For a general bilinear form $B$ on a finite-dimensional vector space over any field, we can define a Gram matrix $G$ attached to a set of vectors $v_1, \ldots, v_n$ by $G_{ij} = B(v_i, v_j)$. The matrix will be symmetric if the bilinear form $B$ is symmetric.
Applications
- In Riemannian geometry, given an embedded $k$-dimensional Riemannian manifold $M \subset \mathbb{R}^n$ and a coordinate chart $\phi: U \to M$ for $(x_1, \ldots, x_k) \in U \subseteq \mathbb{R}^k$, the volume form $\omega$ on $M$ induced by the embedding may be computed using the Gramian of the coordinate tangent vectors: $\omega = \sqrt{\det G}\; dx_1 \cdots dx_k$, where $G = \left[ \left\langle \tfrac{\partial \phi}{\partial x_i}, \tfrac{\partial \phi}{\partial x_j} \right\rangle \right]$.
This generalizes the classical surface integral of a parametrized surface $\phi: U \to S \subset \mathbb{R}^3$ for $(x, y) \in U \subseteq \mathbb{R}^2$: $\int_S f \, dA = \iint_U f(\phi(x, y)) \left| \tfrac{\partial \phi}{\partial x} \times \tfrac{\partial \phi}{\partial y} \right| \, dx \, dy.$
- If the vectors are centered random variables, the Gramian is approximately proportional to the covariance matrix, with the scaling determined by the number of elements in the vector.
- In quantum chemistry, the Gram matrix of a set of basis vectors is the overlap matrix.
- In control theory (or more generally systems theory), the controllability Gramian and observability Gramian determine properties of a linear system.
- Gramian matrices arise in covariance structure model fitting (see e.g., Jamshidian and Bentler, 1993, Applied Psychological Measurement, Volume 18, pp. 79–94).
- In the finite element method, the Gram matrix arises from approximating a function from a finite dimensional space; the Gram matrix entries are then the inner products of the basis functions of the finite dimensional subspace.
- In machine learning, kernel functions are often represented as Gram matrices.
- Since the Gram matrix over the reals is a symmetric matrix, it is diagonalizable and its eigenvalues are non-negative. Diagonalizing the Gram matrix $V^\top V$ is closely related to the singular value decomposition of $V$: the eigenvalues of the Gram matrix are the squares of the singular values of $V$ (a numerical sketch follows below).
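A numerical sketch of the last point (illustrative random data; NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.standard_normal((5, 3))                 # columns are the vectors
G = V.T @ V
eigvals = np.linalg.eigvalsh(G)                 # eigenvalues of the Gram matrix
svals = np.linalg.svd(V, compute_uv=False)      # singular values of V
assert np.all(eigvals >= -1e-12)                # non-negative up to roundoff
assert np.allclose(np.sort(eigvals), np.sort(svals ** 2))
```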
Properties
Positive-semidefiniteness
The Gramian matrix is positive-semidefinite, and every symmetric positive-semidefinite matrix is the Gramian matrix for some set of vectors. Further, in finite dimensions it determines the vectors up to isomorphism, i.e. any two sets of vectors with the same Gramian matrix must be related by a single unitary matrix. These facts follow from taking the spectral decomposition of any positive-semidefinite matrix $M$, so that $M = U D U^\dagger = \left(U D^{1/2}\right)\left(U D^{1/2}\right)^\dagger$ and so $M$ is the Gramian matrix of the rows of $U D^{1/2}$. The Gramian matrix of any orthonormal basis is the identity matrix. The infinite-dimensional analog of this statement is Mercer's theorem.
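A minimal sketch of the converse direction (illustrative data; NumPy assumed): a positive-semidefinite matrix $M$ is recovered as the Gramian of the rows of $U D^{1/2}$ obtained from its spectral decomposition.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
M = A @ A.T                                # an arbitrary positive-semidefinite matrix
d, U = np.linalg.eigh(M)                   # spectral decomposition M = U diag(d) U^T
B = U * np.sqrt(np.clip(d, 0.0, None))     # B = U D^{1/2} (clip tiny negative roundoff)
assert np.allclose(B @ B.T, M)             # M is the Gram matrix of the rows of B
```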
Derivation of positive-semidefiniteness
The fact that the Gramian matrix is positive-semidefinite can be seen from the following simple derivation:
$x^\dagger G x = \sum_{i,j} \overline{x_i}\, x_j \left\langle v_i, v_j \right\rangle = \left\langle \sum_i x_i v_i, \sum_j x_j v_j \right\rangle = \left\| \sum_i x_i v_i \right\|^2 \geq 0.$
The first equality follows from the definition of matrix multiplication, the second and third from the bilinearity of the inner product, and the last from the positive definiteness of the inner product. Note that this also shows that the Gramian matrix is positive definite if and only if the vectors $v_i$ are linearly independent.
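A numerical check of the identity used in the derivation (illustrative data; NumPy assumed): $x^\dagger G x$ equals the squared norm of the linear combination $\sum_i x_i v_i$.

```python
import numpy as np

rng = np.random.default_rng(2)
V = rng.standard_normal((6, 4))        # columns are the vectors v_i
G = V.T @ V
x = rng.standard_normal(4)
lhs = x @ G @ x
rhs = np.linalg.norm(V @ x) ** 2       # || sum_i x_i v_i ||^2
assert np.isclose(lhs, rhs) and lhs >= 0.0
```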
Change of basis
Under a change of basis represented by an invertible matrix $P$, the Gram matrix will change by a matrix congruence to $P^\dagger G P$ (in the real case, $P^\top G P$).
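A sketch verifying the congruence numerically (illustrative data; the random matrix $P$ is assumed to be invertible, as a generic random matrix is):

```python
import numpy as np

rng = np.random.default_rng(3)
V = rng.standard_normal((5, 3))        # columns are the original vectors
P = rng.standard_normal((3, 3))        # change-of-basis matrix (generically invertible)
G = V.T @ V
G_new = (V @ P).T @ (V @ P)            # Gram matrix of the transformed vectors
assert np.allclose(G_new, P.T @ G @ P)
```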
Gram determinant
The Gram determinant or Gramian is the determinant of the Gram matrix:
$|G| = \begin{vmatrix} \langle v_1, v_1 \rangle & \langle v_1, v_2 \rangle & \cdots & \langle v_1, v_n \rangle \\ \langle v_2, v_1 \rangle & \langle v_2, v_2 \rangle & \cdots & \langle v_2, v_n \rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle v_n, v_1 \rangle & \langle v_n, v_2 \rangle & \cdots & \langle v_n, v_n \rangle \end{vmatrix}.$
Geometrically, the Gram determinant is the square of the volume of the parallelotope formed by the vectors. In particular, the vectors are linearly independent if and only if the Gram determinant is nonzero (if and only if the Gram matrix is nonsingular).
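A numerical illustration of the volume interpretation (illustrative data; NumPy assumed): for two vectors in $\mathbb{R}^3$ the Gram determinant equals the squared area of the parallelogram they span, i.e. the squared norm of their cross product.

```python
import numpy as np

u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 3.0])
G = np.array([[u @ u, u @ v],
              [v @ u, v @ v]])
area_sq = np.linalg.norm(np.cross(u, v)) ** 2
assert np.isclose(np.linalg.det(G), area_sq)   # det G = 46 = |u x v|^2
```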
The Gram determinant can also be expressed in terms of the exterior product of vectors by
$|G| = \left\| v_1 \wedge \cdots \wedge v_n \right\|^2.$
See also
References
- Horn & Johnson 2013, p. 441
Theorem 7.2.10 Let $v_1, \ldots, v_k$ be vectors in an inner product space $V$ with inner product $\langle \cdot, \cdot \rangle$ and let $G = [\langle v_j, v_i \rangle]_{i,j=1}^{k}$. Then
(a) $G$ is Hermitian and positive-semidefinite;
(b) $G$ is positive-definite if and only if the vectors $v_1, \ldots, v_k$ are linearly independent;
(c)
- Lanckriet, G. R. G.; Cristianini, N.; Bartlett, P.; Ghaoui, L. E.; Jordan, M. I. (2004). "Learning the kernel matrix with semidefinite programming". Journal of Machine Learning Research. 5: 27–72.
- Horn, Roger A.; Johnson, Charles R. (2013). "7.2 Characterizations and Properties". Matrix Analysis (Second Edition). Cambridge University Press. ISBN 978-0-521-83940-2.
External links
- "Gram matrix", Encyclopedia of Mathematics, EMS Press, 2001
- Volumes of parallelograms by Frank Jones