Section 6.3 Computing eigenspaces
We have already defined eigenspaces in Definition 6.1.7. Suppose we are given a square matrix \(A\) of order \(n\) and a real number \(\lambda\text{,}\) and we want to find all vectors in the eigenspace \(E_\lambda\text{,}\) that is, all vectors \(\vec x\) satisfying \(A\vec x=\lambda\vec x\text{.}\)
If \(A\vec x=\lambda\vec x\text{,}\) then \(A\vec x-\lambda\vec x=\vec 0\text{,}\) and so \((A-\lambda I)\vec x=\vec 0\text{.}\) Hence we need only solve a homogeneous system of linear equations.
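This null-space computation can be sketched with SymPy; here we borrow the matrix of Example 6.1.4 with \(\lambda=3\) purely for illustration:

```python
import sympy as sp

# Matrix from Example 6.1.4 and one of its eigenvalues.
A = sp.Matrix([[2, 1, 4],
               [0, 3, 0],
               [2, -2, -5]])
lam = 3

# The eigenspace E_lambda is the null space of A - lambda*I.
M = A - lam * sp.eye(3)
basis = M.nullspace()   # a list of basis vectors for E_lambda
print(basis)
```

Each vector returned by `nullspace()` satisfies \((A-\lambda I)\vec x=\vec 0\text{,}\) so together they span the eigenspace.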
From the previous Example 6.1.3, the matrix \(A\) given there has eigenvalues \(\lambda=2,4,6\text{.}\) We now find the eigenspace for each eigenvalue.
Example 6.3.1. \(\lambda=2\).
As usual, we put the augmented matrix into reduced row echelon form:
and so all solutions are of the form \((x,y,z)=(t,t,t)=t(1,1,1)\text{.}\)
Example 6.3.2. \(\lambda=4\).
and so all solutions are of the form \((x,y,z)=(t,-t,t)=t(1,-1,1)\text{.}\)
Example 6.3.3. \(\lambda=6\).
and so all solutions are of the form \((x,y,z)=(-t,-t,t)=t(-1,-1,1)\text{.}\)
Notice that setting \(t=1\) in each case gives us the original eigenvectors of the example.
We can use similar arguments for Example 6.1.4, in which \(A=\begin{bmatrix} 2\amp 1\amp 4\\ 0\amp 3\amp 0\\ 2\amp -2\amp -5 \end{bmatrix}\) and \(\lambda=-6,3\text{:}\)
Example 6.3.4. \(\lambda=-6\).
and so all eigenvectors are of the form \((x,y,z)=t(1,0,-2)\text{.}\)
Example 6.3.5. \(\lambda=3\).
and so all eigenvectors are of the form \((x,y,z)=t(1,1,0)+u(4,0,1)\text{.}\)
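These eigenpairs for the matrix of Example 6.1.4 can be checked numerically; a quick sketch with NumPy:

```python
import numpy as np

# Matrix from Example 6.1.4 and the eigenvectors found above.
A = np.array([[2, 1, 4],
              [0, 3, 0],
              [2, -2, -5]], dtype=float)

pairs = [(-6, np.array([1.0, 0.0, -2.0])),   # lambda = -6
         (3,  np.array([1.0, 1.0, 0.0])),    # lambda = 3
         (3,  np.array([4.0, 0.0, 1.0]))]    # lambda = 3

# Verify A x = lambda x for each pair.
for lam, x in pairs:
    assert np.allclose(A @ x, lam * x)
print("all eigenpairs verified")
```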
Proposition 6.3.6. Eigenvalues and the powers of a matrix.
If \(A\vec x=\lambda \vec x\) then \(A^n\vec x=\lambda^n \vec x\) for \(n=1,2,\ldots\text{.}\)
Proof.
Since \(A\vec x=\lambda\vec x\text{,}\) we have \(A^2\vec x=A(A\vec x)=A(\lambda\vec x)=\lambda(A\vec x)=\lambda^2\vec x\text{.}\) Repeating this process yields the desired result.
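The proposition is easy to check numerically. A sketch with NumPy, using the eigenpair \(\lambda=-6\text{,}\) \(\vec x=(1,0,-2)\) from Example 6.3.4:

```python
import numpy as np

# Check: if A x = lam x, then A^n x = lam^n x.
A = np.array([[2, 1, 4],
              [0, 3, 0],
              [2, -2, -5]], dtype=float)
x = np.array([1.0, 0.0, -2.0])   # eigenvector for lam = -6
lam = -6.0

for n in range(1, 6):
    assert np.allclose(np.linalg.matrix_power(A, n) @ x, lam**n * x)
print("A^n x = lam^n x holds for n = 1, ..., 5")
```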
Proposition 6.3.7. Eigenspaces and the powers of a matrix.
Let \(A\) be an \(n\times n\) matrix having \(\vec x_1,\ldots,\vec x_m\) as eigenvectors with \(\lambda_1,\ldots,\lambda_m\) as corresponding eigenvalues. Further, let \(\vec x\in \Span\{\vec x_1,\ldots,\vec x_m\}\text{.}\) Then, for some \(r_1,\ldots,r_m\text{,}\)
\(A\vec x=\sum_{i=1}^m r_i\lambda_i\vec x_i\text{,}\) and
\(A^n\vec x=\sum_{i=1}^m r_i\lambda_i^n\vec x_i\) for \(n\ge1\text{.}\)
Proof.
By definition of the span of a set, \(\vec x=\sum_{i=1}^m r_i\vec x_i\) for some \(r_1,\ldots,r_m\text{,}\) and so \(A\vec x=\sum_{i=1}^m r_i A\vec x_i=\sum_{i=1}^m r_i\lambda_i\vec x_i\text{.}\) Similarly, by Proposition 6.3.6, \(A^n\vec x=\sum_{i=1}^m r_i A^n\vec x_i=\sum_{i=1}^m r_i\lambda_i^n\vec x_i\text{.}\)
Example 6.3.8. Eigenspaces and the powers of a matrix.
Let \(A=\begin{bmatrix}0\amp1\amp1\\1\amp0\amp1\\1\amp1\amp0 \end{bmatrix}\text{,}\) \(\vec x_1=\begin{bmatrix}1\\1\\1\end{bmatrix}\text{,}\) \(\vec x_2=\begin{bmatrix}-1\\1\\0\end{bmatrix}\text{,}\) and \(\vec x_3=\begin{bmatrix}-1\\0\\1\end{bmatrix}\text{.}\) Then \(\vec x_1\text{,}\) \(\vec x_2\text{,}\) and \(\vec x_3\) are eigenvectors of \(A\) with corresponding eigenvalues \(2\text{,}\) \(-1\text{,}\) and \(-1\text{.}\) In fact \(\{\vec x_1, \vec x_2, \vec x_3\}\) is a basis for \(\R^3\text{.}\) From the definition of a basis, for any given \(\vec x\in\R^3\text{,}\) there is a unique choice of \(r_1,r_2,r_3\) so that \(\vec x=r_1\vec x_1 + r_2\vec x_2 + r_3\vec x_3\text{.}\) From the previous proposition, \(A^n\vec x = 2^n r_1\vec x_1 + (-1)^n r_2\vec x_2 + (-1)^n r_3\vec x_3\text{.}\)
As \(n\) gets large, the coefficient of \(\vec x_1\) dominates the others, and so (provided \(r_1\ne0\)) the value of \(A^n\vec x\) is very close to a scalar multiple of \(\vec x_1\text{,}\) that is, it approaches the eigenspace \(E_2\text{.}\)
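This behaviour, which underlies the power-iteration method, can be illustrated with NumPy using the matrix of this example; the starting vector below is an arbitrary choice:

```python
import numpy as np

# Matrix from Example 6.3.8.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
x = np.array([1.0, 2.0, 3.0])    # an arbitrary starting vector

# Repeatedly apply A and normalize; the direction of x should
# approach the eigenspace E_2, spanned by (1, 1, 1).
for _ in range(30):
    x = A @ x
    x = x / np.linalg.norm(x)
print(x)   # approximately (1, 1, 1)/sqrt(3)
```

Normalizing at each step only rescales \(x\text{,}\) so the printed direction is that of \(A^{30}\vec x\text{,}\) which is dominated by the \(2^{30}r_1\vec x_1\) term.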