The Gram–Schmidt process is an algorithm which takes as input an ordered basis of an inner product space and produces as output an ordered orthonormal basis.
In terms of matrices, the Gram–Schmidt process is a procedure for factorizing an invertible matrix $M$ in the general linear group $GL_n(\mathbb{R})$ (or $GL_n(\mathbb{C})$) as a product $M = U T$, where $T$ is an upper triangular matrix and $U$ is an orthogonal (or unitary) matrix; as such it is a special case of the more general Iwasawa decomposition for a (connected) semisimple Lie group.
Since the factorization depends smoothly on the parameters, the Gram–Schmidt procedure enables the reduction of the structure group of an inner product vector bundle (e.g., the tangent bundle of a Riemannian manifold or a Kähler manifold) from $GL_n$ to the orthogonal group $O_n$ (or the unitary group $U_n$).
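This factorization can be illustrated concretely: running Gram–Schmidt on the columns of $M$ and recording the projection coefficients produces $U$ and $T$ with $M = U T$. A minimal pure-Python sketch (the function name `gram_schmidt_qr` and the vectors-as-lists convention are ours, not from the text):

```python
# Sketch: factor an invertible real matrix M as M = U·T by running
# Gram-Schmidt on its columns.  U has orthonormal columns, T is upper
# triangular and records the projection coefficients.

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def gram_schmidt_qr(cols):
    """cols: the columns of M as lists.  Returns (U_cols, T) with
    M = U·T, U_cols the orthonormal columns of U, T upper triangular."""
    n = len(cols)
    u_cols = []
    T = [[0.0] * n for _ in range(n)]
    for j, v in enumerate(cols):
        w = list(v)
        for i, u in enumerate(u_cols):
            T[i][j] = dot(u, v)                    # coefficient of v along u_i
            w = [wk - T[i][j] * uk for wk, uk in zip(w, u)]
        T[j][j] = dot(w, w) ** 0.5                 # norm of the remainder
        u_cols.append([wk / T[j][j] for wk in w])  # normalize
    return u_cols, T

# Example: M with columns (1, 0) and (1, 1).
U, T = gram_schmidt_qr([[1.0, 0.0], [1.0, 1.0]])
```

For an invertible $M$ the remainders never vanish, so the division is safe, and $T$ comes out invertible upper triangular.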
In this section, “basis” is understood to mean an ordered independent set whose linear span is dense in a Hilbert space $H$ (dense with respect to the metric topology of $H$). We describe the Gram–Schmidt process as applied to a $d$-dimensional Hilbert space for some cardinal $d$, with a basis $v_0, v_1, \ldots$ consisting of $d$ vectors.
The orthonormal basis $u_0, u_1, \ldots$ produced as output is defined recursively by a) subtracting the orthogonal projection onto the closed subspace generated by all previous vectors and b) normalizing. We denote the orthogonal projection onto a closed subspace $A$ by $\pi_A\colon H\to A$ and the normalization $v/\|v\|$ of a vector $v \in H$ by $N(v)$. For ordinals $\alpha \lt d$ define

$$u_\alpha \coloneqq N\big(v_\alpha - \pi_{S_\alpha}(v_\alpha)\big)$$

where $S_\alpha$ is the closure of the span of $\{v_\beta \colon \beta \lt \alpha\}$, noting that the projection $\pi_{S_\alpha}$ exists because $H$ is complete. This can be rewritten more explicitly using transfinite recursion as

$$u_\alpha = N\left(v_\alpha - \sum_{\beta \lt \alpha} \langle u_\beta, v_\alpha \rangle\, u_\beta\right)$$

where the sum on the right is well defined by the Bessel inequality, i.e. only countably many coefficients are non-zero and they are square-summable. A simple (transfinite) inductive argument shows that the $u_\alpha$ are unit vectors orthogonal to each other, and that the closed span of $\left\{u_\beta \colon \beta \lt \alpha\right\}$ is equal to the closed span of $\left\{v_\beta \colon \beta \lt \alpha\right\}$ for $\alpha \leq d$. Therefore $u_0, u_1, \ldots$ is an orthonormal basis of $H$.
(Application to non-bases)
If we apply the Gram–Schmidt process to a well-ordered independent set whose closed linear span $S$ is not all of $H$, we still get an orthonormal basis of the subspace $S$. If we apply the Gram–Schmidt process to a dependent set, then we will eventually run into a vector $v$ whose norm is zero, so we will not be able to take $N(v)$. In that case, however, we can simply remove $v$ from the set and continue; then we will still get an orthonormal basis of the closed linear span. (This conclusion is not generally valid in constructive mathematics, since it relies on excluded middle applied to the statement that $\|v\| \neq 0$. However, it does work over discrete fields, such as the algebraic closure of the rationals, as seen in elementary undergraduate linear algebra.)
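The finite-dimensional case of the process, including the remark on dependent sets, can be sketched in code: project out the part of each vector lying in the span of the previous ones, normalize, and drop any vector whose remainder has norm zero. Function names and the numerical tolerance are our own choices.

```python
# Sketch of finite-dimensional Gram-Schmidt, dropping dependent vectors
# (those whose remainder has norm ~ 0) as described in the remark above.

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def gram_schmidt(vectors, tol=1e-12):
    """Return an orthonormal basis (as lists) of the span of `vectors`."""
    basis = []
    for v in vectors:
        w = list(v)
        for u in basis:
            c = dot(u, v)                          # <u, v> coefficient
            w = [wi - c * ui for wi, ui in zip(w, u)]
        norm = dot(w, w) ** 0.5
        if norm > tol:                             # skip dependent vectors
            basis.append([wi / norm for wi in w])
    return basis

# A dependent triple in R^2: (2, 0) is dropped, and the span is all of R^2.
basis = gram_schmidt([[1.0, 0.0], [2.0, 0.0], [1.0, 1.0]])
```

(The tolerance test `norm > tol` is the floating-point stand-in for the excluded-middle decision $\|v\| \neq 0$ discussed above.)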
There is an alternative algorithm via Gaussian elimination which, for any tuple of vectors, produces an orthogonal basis of its linear span (Pursell-Trimble 91).
In the special case that the original vectors are linearly independent, normalizing the resulting orthogonal basis yields the same orthonormal basis as produced by the Gram-Schmidt process.
(LU-decomposition of positive semidefinite matrices)
Every positive semidefinite matrix $M$ has a matrix product-decomposition

$$M = L \cdot U$$

where

* $L$ is an invertible lower triangular matrix,

* $U$ is an upper triangular matrix.
(orthogonal basis of linear span via LU-decomposition)
Let $(a_j)$ be a tuple of vectors of the same length, and let

$$A \coloneqq (a_1 | a_2 | \cdots)$$

be the matrix whose $j$th column is $a_j$.
Since the matrix product $M \coloneqq A^T \cdot A$ of $A$ with its transpose matrix is necessarily a positive semidefinite symmetric matrix, it has an LU-decomposition according to the above Lemma:

$$A^T \cdot A = L \cdot U \,.$$
Then the column vectors $(\hat q_j)$ of the matrix

$$A \cdot (L^{-1})^T$$
constitute an orthogonalization of the original tuple of vectors $(a_i)$ in that the non-zero $\hat q_j$ form an orthogonal linear basis of the linear span of the $(a_j)$.
(Pursell-Trimble 91, top of p. 5)
That the matrix $L$ in the Proposition is lower triangular and invertible (by the Lemma) means that its inverse matrix $L^{-1}$ is also a lower triangular matrix; left multiplication by $L^{-1}$ is thus a process of row reduction from $A^T \cdot A$ to $U$.
Accordingly, the orthogonalization in the Proposition may be understood as applying Gaussian elimination to the augmented matrix

$$\big( A^T \cdot A \,\big|\, A^T \big) \,:$$

the row operations that reduce $A^T \cdot A$ to the upper triangular $U$ simultaneously turn the rows of $A^T$ into the orthogonalized vectors $\hat q_j$, since $L^{-1} A^T$ has the $\hat q_j$ as its rows.
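The elimination procedure can be sketched directly in code: form the Gram matrix $A^T A$, row-reduce it to upper triangular form, and apply the same row operations to $A^T$; the rows of the transformed $A^T$ are then pairwise orthogonal. The following is our own sketch of this idea, using exact rational arithmetic:

```python
# Sketch of orthogonalization by Gaussian elimination: row-reduce the
# Gram matrix A^T·A to upper triangular form and apply the same row
# operations to A^T; the resulting rows are pairwise orthogonal.
from fractions import Fraction

def orthogonalize_by_elimination(columns):
    """`columns` are the columns a_j of A; returns orthogonalized rows."""
    n = len(columns)
    A_T = [[Fraction(x) for x in a] for a in columns]   # rows of A^T
    # Gram matrix M = A^T·A: inner products of the a_j
    M = [[sum(x * y for x, y in zip(a, b)) for b in A_T] for a in A_T]
    for i in range(n):
        if M[i][i] == 0:
            continue                    # dependent vector: its row is zero
        for k in range(i + 1, n):
            f = M[k][i] / M[i][i]
            M[k] = [mk - f * mi for mk, mi in zip(M[k], M[i])]
            A_T[k] = [ak - f * ai for ak, ai in zip(A_T[k], A_T[i])]
    return A_T

q = orthogonalize_by_elimination([[1, 1, 0], [1, 0, 1]])
```

Unlike Gram–Schmidt itself, no square roots appear, so the computation stays inside the rationals; normalizing the non-zero rows recovers an orthonormal basis.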
A classic illustration of Gram–Schmidt is the production of the Legendre polynomials.
Let $H$ be the Hilbert space $H = L^2([-1, 1])$ of square integrable functions on the closed interval $[-1, 1]$, equipped with the standard inner product defined by

$$\langle f, g \rangle \coloneqq \int_{-1}^{1} \overline{f(x)}\, g(x)\, d x \,.$$
By the Stone-Weierstrass theorem, the space of polynomials $\mathbb{C}[x]$ is dense in $H$ via its standard inclusion, and so the monomials $1, x, x^2, \ldots$ form an ordered basis of $H$ in the sense above.
Applying the Gram–Schmidt process as above, one readily computes the first few orthonormal functions:

$$u_0 = \tfrac{1}{\sqrt{2}}, \qquad u_1 = \sqrt{\tfrac{3}{2}}\, x, \qquad u_2 = \sqrt{\tfrac{5}{8}}\, (3 x^2 - 1), \qquad \ldots$$
The classical Legendre polynomials $P_n(x)$ are scalar multiples of the functions $u_n$, adjusted so that $P_n(1) = 1$; they satisfy the orthogonality relations

$$\int_{-1}^{1} P_m(x)\, P_n(x)\, d x = \frac{2}{2 n + 1}\, \delta_{m n}$$

where $\delta_{m n}$ is the Kronecker delta.
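This computation can be reproduced exactly in code: apply Gram–Schmidt without normalization to $1, x, x^2, \ldots$ with the inner product above (monomial integrals over $[-1, 1]$ are rational), then rescale so that $P_n(1) = 1$. A sketch, with polynomials as coefficient lists and all names our own:

```python
# Sketch: Legendre polynomials via Gram-Schmidt on 1, x, x^2, ... in
# L^2([-1, 1]), with exact rational arithmetic.  A polynomial is a list
# [c0, c1, ...] meaning c0 + c1*x + c2*x^2 + ...
from fractions import Fraction

def inner(p, q):
    """<p, q> = integral of p(x) q(x) over [-1, 1], term by term."""
    total = Fraction(0)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            if (i + j) % 2 == 0:         # odd powers integrate to zero
                total += Fraction(2, i + j + 1) * a * b
    return total

def legendre(n):
    """Coefficient lists of P_0, ..., P_n, normalized so P_k(1) = 1."""
    ortho = []
    for k in range(n + 1):
        p = [Fraction(0)] * k + [Fraction(1)]        # monomial x^k
        for q in ortho:
            c = inner(p, q) / inner(q, q)            # projection onto q
            q_padded = q + [Fraction(0)] * (len(p) - len(q))
            p = [a - c * b for a, b in zip(p, q_padded)]
        ortho.append(p)
    # rescale each polynomial so that its value at x = 1 equals 1
    return [[a / sum(p) for a in p] for p in ortho]

P = legendre(3)   # P[2] is the coefficient list of (3x^2 - 1)/2
```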
Many aspects of the Gram–Schmidt process can be categorified so as to apply to 2-Hilbert spaces, as indicated at Schur's lemma in the section In terms of categorical algebra.
We will illustrate the basic idea with an example that was suggested to us by James Dolan. (See also at permutation representation the section Examples – Virtual permutation representations.)
Consider the category $S_n Rep$ of representations over the complex numbers of the symmetric group $G \coloneqq S_n$. (As a running example, we consider $S_4$; up to isomorphism, there are five irreducible representations,
classified by the five Young diagrams of size 4, i.e., by the partitions $(4)$, $(3,1)$, $(2,2)$, $(2,1,1)$, $(1,1,1,1)$. To save space, we denote these as $U_1$, $U_2$, $U_3$, $U_4$, $U_5$.)
The irreducible representations $U_i$ of $S_n$ form a $2$-orthonormal basis in the sense that any two of them $U_i, U_j$ satisfy the relation

$$hom_G(U_i, U_j) \;\cong\; \delta_{i j} \cdot \mathbb{C}$$

(where $hom_G$ denotes the hom vector space in $S_n Rep$, $\delta_{i j}$ is the Kronecker delta, and $n \cdot \mathbb{C}$ indicates a direct sum of $n$ copies of $\mathbb{C}$). In fact, the irreducible representations are uniquely determined up to isomorphism by these relations.
There is however another way of associating representations to partitions or Young diagrams. Namely, consider the subgroup of permutations which take each row of a Young diagram or Young tableau of size $n$ to itself; this forms a parabolic subgroup of $S_n$, conjugate to one of type $P_{(n_1 \ldots n_k)} = S_{n_1} \times \ldots \times S_{n_k}$ where $n_i$ is the length of the $i$th row of the Young diagram. The group $S_n$ acts transitively on the set of cosets

$$S_n / P_{(n_1 \ldots n_k)}$$

and these actions give linear permutation representations $\mathbb{C}\big[S_n/P_{(n_1 \ldots n_k)}\big]$ of $S_n$. Equivalently, these are linear representations $V_i$ which are induced from the trivial representation along inclusions of parabolic subgroups.
We claim that:

1. these representations form a $\mathbb{Z}$-linear basis of the representation ring $R(S_n)$;

1. we may calculate their characters using a categorified Gram–Schmidt process.
We indicate the proof:
Given two such parabolic subgroups $P$, $Q$ in $G = S_n$, the $2$-inner product

$$hom_G\big(\mathbb{C}[G/P],\, \mathbb{C}[G/Q]\big)$$

may be identified with the free vector space on the set of double cosets $P\backslash G/Q$.
One may count the number of double cosets by hand in a simple case like $G = S_4$. That is, for the 5 representations $V_1, \ldots, V_5$ induced from the 5 parabolic subgroups $P_i$ corresponding to the 5 Young diagrams listed above, the dimensions of the 2-inner products $hom(V_i, V_j)$ are the sizes of the corresponding double coset spaces $P_i\backslash S_4 /P_j$. These numbers form a matrix as follows (following the order of the $5$ partitions listed above):
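The double-coset counts can also be checked by brute force. The following sketch (our own code, assuming the five partitions are ordered $(4), (3,1), (2,2), (2,1,1), (1,1,1,1)$ as above) builds the Young (parabolic) subgroups of $S_4$ and counts the double cosets $P_i \backslash S_4 / P_j$ directly:

```python
# Sketch: brute-force computation of the double-coset counts
# |P_i \ S_4 / P_j| for the Young (parabolic) subgroups of S_4, giving
# the matrix of dimensions dim hom(V_i, V_j) discussed in the text.
from itertools import permutations

def young_subgroup(partition):
    """Permutations of {0,...,n-1} preserving consecutive blocks of the
    given sizes, e.g. (3, 1) -> S_{0,1,2} x S_{3}."""
    n = sum(partition)
    blocks, start = [], 0
    for size in partition:
        blocks.append(range(start, start + size))
        start += size
    return [g for g in permutations(range(n))
            if all(all(g[i] in b for i in b) for b in blocks)]

def double_cosets(P, G, Q):
    """Number of double cosets P\\G/Q, by partitioning G."""
    seen, count = set(), 0
    for g in G:
        if g not in seen:
            count += 1
            for p in P:
                for q in Q:
                    # the element p·g·q, composing permutations as maps
                    seen.add(tuple(p[g[q[i]]] for i in range(len(g))))
    return count

G = list(permutations(range(4)))
parts = [(4,), (3, 1), (2, 2), (2, 1, 1), (1, 1, 1, 1)]
subs = [young_subgroup(p) for p in parts]
matrix = [[double_cosets(P, G, Q) for Q in subs] for P in subs]
```

The resulting matrix is symmetric, its first row is all ones (the trivial representation pairs with everything one-dimensionally), and its last diagonal entry is $|S_4| = 24$, as expected for the regular representation.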
To reiterate: this matrix is the decategorification (a matrix of dimensions) of a matrix of $2$-inner products whose $(i j)$-entry is of the form

$$hom_G(V_i, V_j)$$

where the $V_i$ are induced from inclusions of parabolic subgroups. The $V_i$ are $\mathbb{N}$-linear combinations of the irreducible representations $U_i$, which form a $2$-orthonormal basis, and we may perform a series of elementary row operations which convert this matrix into an upper triangular matrix, which turns out to be the decategorified form of the 2-matrix with entries

$$hom_G(U_i, V_j)$$

where $U_i$ is the irrep corresponding to the $i^{th}$ Young diagram (as listed above). The upper triangular matrix is
and we read off from the columns the following decompositions into irreducible components:
The last representation $V_5$ is the regular representation of $S_4$ (because the corresponding parabolic subgroup is trivial). Since we know from general theory that the multiplicity of the irreducible $U_i$ in the regular representation is its dimension, we get as a by-product the dimensions of the $U_i$ from the expression for $V_5$:

$$\dim U_1 = 1, \quad \dim U_2 = 3, \quad \dim U_3 = 2, \quad \dim U_4 = 3, \quad \dim U_5 = 1$$
(the first of the $U_i$ is the trivial representation, and the last $U_5$ is the alternating representation).
The row operations themselves can be assembled as the lower triangular matrix
and from the rows we read off the irreducible representations as “virtual” (i.e., $\mathbb{Z}$-linear) combinations of the parabolically induced representations $V_i$:
which can be considered the result of the categorified Gram–Schmidt process.
It follows from these expressions that the $V_i$ form a $\mathbb{Z}$-linear basis of the representation ring $R(S_4)$.
Analogous statements hold for each symmetric group $S_n$.
See also
The formulation via Gaussian elimination is due to

* Lyle Pursell, S. Y. Trimble, *Gram-Schmidt orthogonalization by Gauss elimination*, The American Mathematical Monthly **98** (1991)