% EIG-0020/main.tex
\section*{The Characteristic Equation}
\end{onlineOnly}
Let $A$ be an $n \times n$ matrix. In the previous section we learned that the eigenvectors and eigenvalues of $A$ are vectors $\vec{x}$ and scalars $\lambda$ that satisfy the equation
\begin{align}\label{def:eigen} A \vec{x} = \lambda\vec{x}\end{align}
We listed a few reasons why we are interested in finding eigenvalues and eigenvectors, but we did not give any process for finding them. In this section we will focus on a process which can be used for small matrices. For larger matrices, the best methods we have are iterative methods, and we will explore some of these in \href{https://ximera.osu.edu/linearalgebradzv3/LinearAlgebraInteractiveIntro/EIG-0070/main}{The Power Method and the Dominant Eigenvalue}.
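To get started, we rewrite Equation \ref{def:eigen} so that all terms are on one side; the step of rewriting the scalar multiple $\lambda\vec{x}$ as the matrix product $\lambda I\vec{x}$ is needed before we can factor:
\begin{align*}
A\vec{x}-\lambda\vec{x}&=\vec{0}\\
A\vec{x}-\lambda I\vec{x}&=\vec{0}\\
(A-\lambda I)\vec{x}&=\vec{0}
\end{align*}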
The middle step was necessary before factoring because \wordChoice{\choice[correct]{we cannot subtract a $1\times1$ scalar $\lambda$ from an $n \times n$ matrix $A$} \choice{$\lambda$ is a Greek letter}}.
%The middle step was necessary before factoring because we cannot subtract a $1 \times 1$ scalar $\lambda$ from an $n \times n$ matrix $A$.
This shows that any eigenvector $\vec{x}$ of $A$ is in the \wordChoice{\choice{row space}\choice{column space}\choice[correct]{null space}} of the related matrix, $A-\lambda I$.
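Because an eigenvector is required to be nonzero, the null space of $A-\lambda I$ must contain nonzero vectors, so $A-\lambda I$ must be singular. In other words, the eigenvalues of $A$ are precisely the solutions $\lambda$ of the characteristic equation
$$\det(A-\lambda I)=0$$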
&=-(\lambda-4)(\lambda-1)^2
\end{align*}
Matrix $C$ has eigenvalues $\lambda_1=1$ and $\lambda_2=4$.
\begin{remark}
The factor $(\lambda-1)$ appears twice. This repeated factor gives rise to the eigenvalue $\lambda_1=1$. We say that the eigenvalue $\lambda_1=1$ has \dfn{algebraic multiplicity} $2$.
\end{remark}
\end{explanation}
\end{example}
\begin{definition}\label{def:algMult}
The \dfn{algebraic multiplicity} of an eigenvalue is its multiplicity as a root of the characteristic equation.
\end{definition}
It is not always possible to completely factor the characteristic polynomial using only real numbers. However, a fundamental fact from algebra is that every degree $n$ polynomial has $n$ roots (counting multiplicity) provided that we allow complex numbers. This is why sometimes eigenvalues and their corresponding eigenvectors involve complex numbers. The next example illustrates this point.
\begin{example}\label{ex:3x3_complex_eig}
Let $D=\begin{bmatrix} 0&0&0\\0 &1&1\\0 & -1&1\end{bmatrix}$. Compute the eigenvalues of this matrix.
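A sketch of the computation, following the same method as in the previous examples: we solve the characteristic equation $\det(D-\lambda I)=0$.
\begin{align*}
\det(D-\lambda I)&=\det\begin{bmatrix} -\lambda&0&0\\0 &1-\lambda&1\\0 & -1&1-\lambda\end{bmatrix}\\
&=-\lambda\left((1-\lambda)^2+1\right)\\
&=-\lambda(\lambda^2-2\lambda+2)
\end{align*}
The quadratic factor has no real roots; the quadratic formula gives $\lambda=\frac{2\pm\sqrt{4-8}}{2}=1\pm i$. So $D$ has eigenvalues $\lambda=0$, $\lambda_1=1+i$, and $\lambda_2=1-i$.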
\begin{align}\label{eqn:nullspace}
(A-\lambda I) \vec{x}=\vec{0}
\end{align}
As we have seen in the previous section, a scalar multiple of an eigenvector is an eigenvector. So for any given eigenvalue $\lambda$ there are infinitely many eigenvectors associated with it. In fact, the eigenvectors associated with $\lambda$, together with the zero vector, form a subspace of $\RR^n$.
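To verify the first claim: if $A\vec{x}=\lambda\vec{x}$ and $c$ is a nonzero scalar, then
$$A(c\vec{x})=c(A\vec{x})=c(\lambda\vec{x})=\lambda(c\vec{x})$$
so $c\vec{x}$ is again an eigenvector of $A$ associated with $\lambda$.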
\begin{theorem}\label{th:eigenspace}
Let $A$ be an $n\times n$ matrix and let $\lambda$ be an eigenvalue of $A$. Then the set of all eigenvectors associated with $\lambda$, together with the zero vector, is a subspace of $\RR^n$.
\end{theorem}
\begin{proof}
See Practice Problem \ref{prob:eigenspace1}.
\end{proof}
This motivates the following definition.
\begin{definition}\label{def:eigspace}
The set of all eigenvectors associated with a given eigenvalue of a matrix, together with the zero vector, is known as the \dfn{eigenspace} associated with that eigenvalue.
\end{definition}
Given an eigenvalue $\lambda$, there is an associated eigenspace $\mathcal{S}_{\lambda}$. Our goal is to find a basis of $\mathcal{S}_{\lambda}$. Then any eigenvector $\vec{x}$ associated with $\lambda$ will be a linear combination of the vectors in that basis. In seeking a basis for $\mathcal{S}_{\lambda}$, we are trying to find a basis for the set of vectors that satisfy $(A-\lambda I) \vec{x}=\vec{0}$, which means we seek a basis for $\mbox{null}(A-\lambda I)$. % We have already learned how to compute a basis of a null space - see \href{https://ximera.osu.edu/linearalgebradzv3/LinearAlgebraInteractiveIntro/VSP-0040/main}{Subspaces Associated with Matrices}.
Let's return to the examples we did in the first part of this section.
Vectors in the null space have the form $\begin{bmatrix}1\\1\end{bmatrix}t$. This means that $\left\{\begin{bmatrix}1\\1\end{bmatrix}\right\}$ is one possible basis for the eigenspace $\mathcal{S}_3$.
These computations reinforce the geometric insights into this problem that we gained in the previous section.
\begin{onlineOnly}
\begin{center}
\geogebra{yr2btbqj}{800}{600}
\end{center}
\end{onlineOnly}
\end{explanation}
\end{example}
From this we see that an eigenvector in $\mathcal{S}_0$ has the form $\begin{bmatrix}-1/2\\1\end{bmatrix}t$. %$\begin{bmatrix}x_1\\x_2\end{bmatrix}$ in $\mathcal{S}_0$ satisfies $x_1+\frac{1}{2} x_2=0$, so that $2x_1=-x_2$.
This means that $\left\{\begin{bmatrix}-1/2\\1\end{bmatrix}\right\}$ is one possible basis for the eigenspace $\mathcal{S}_0$. By letting $t=-2$, we obtain an arguably nicer-looking basis: $\left\{\begin{bmatrix}1\\-2\end{bmatrix}\right\}$.
To compute a basis for $\mathcal{S}_4$, the subspace of all eigenvectors associated to the eigenvalue $\lambda_2=4$, we compute:
From this we find that $\left\{\begin{bmatrix}1\\\answer{2}\end{bmatrix}\right\}$ is one possible basis for the eigenspace $\mathcal{S}_4$.
\end{explanation}
\end{align*}
This time there is one free variable. %Setting $x_3=t$, we also get $x_1=t$ and $x_2=t$. From this we see
The eigenvectors in $\mathcal{S}_4$ have the form $\begin{bmatrix}t\\t\\t\end{bmatrix}$, so a possible basis for the eigenspace $\mathcal{S}_4$ is given by $\left\{\begin{bmatrix}1\\1\\1\end{bmatrix}\right\}$.
\begin{remark}
Recall that $\lambda_1=1$ has algebraic multiplicity $2$. Note that $\text{dim}(\mathcal{S}_{\lambda_1})=2$. We say that $\lambda_1$ has geometric multiplicity $2$. We have to be careful, however, not to assume that geometric and algebraic multiplicity are always the same. We will revisit this issue later in the text.
\end{remark}
\end{explanation}
\end{example}
\begin{definition}\label{def:geomMult}
The \dfn{geometric multiplicity} of an eigenvalue is the dimension of the eigenspace corresponding to it.
\end{definition}
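To see concretely how the two multiplicities can differ, consider a matrix that does not appear in the examples above, $M=\begin{bmatrix}1&1\\0&1\end{bmatrix}$. Its characteristic polynomial is $(1-\lambda)^2$, so $\lambda=1$ has algebraic multiplicity $2$. But $M-I=\begin{bmatrix}0&1\\0&0\end{bmatrix}$ has a one-dimensional null space, spanned by $\begin{bmatrix}1\\0\end{bmatrix}$, so the geometric multiplicity of $\lambda=1$ is only $1$.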
\begin{example}\label{ex:3x3_complex_ev} (Finding eigenvectors for Example \ref{ex:3x3_complex_eig})
We know from Example \ref{ex:3x3_complex_eig} that $D=\begin{bmatrix} 0&0&0\\0 &1&1\\0 & -1&1\end{bmatrix}$ has eigenvalues $\lambda=0$, $\lambda_1=1+i$, and $\lambda_2=1-i$. Compute a basis for the eigenspace associated with each eigenvalue.
\begin{explanation}
\end{theorem}
\begin{proof}
A square matrix $A$ is singular if and only if $\det{A}=0$. But $\det{A}=0$ if and only if $\det(A-0I)=0$, which is true if and only if zero is an eigenvalue of $A$.
\end{proof}
A basis for $\mathcal{S}_{\lambda_1}$ is $\left\{\begin{bmatrix}\answer{-1/8}\\1\end{bmatrix}\right\}$. A basis for $\mathcal{S}_{\lambda_2}$ is $\left\{\begin{bmatrix}\answer{1}\\1\end{bmatrix}\right\}$.
\end{problem}
\begin{problem}\label{prob:eigenspace4}Compute the eigenvalues of $A$, and find the corresponding eigenspaces.
$$A=\begin{bmatrix}1&-2\\2&1\end{bmatrix}$$
Let $T=\begin{bmatrix} 1 & 2 & 3\\0 & 5 & 6\\0 & 0 & 9\end{bmatrix}$. Compute a basis for each of the eigenspaces of this matrix, $\mathcal{S}_1$, $\mathcal{S}_5$, and $\mathcal{S}_9$.
Prove Theorem \ref{th:eigtri}.
\begin{hint}
What do we know about the determinant of a triangular matrix?
\end{hint}
%(HINT: Proceed by induction on the dimension n. For the inductive step, compute $\det(A-\lambda I)$ by expanding along the first column (or row) if $T$ is upper (lower) triangular.)
%The following set of problems deals with geometric interpretation of eigenvalues and eigenvectors, as well as linear transformations of the plane. Please use \href{https://ximera.osu.edu/linearalgebradzv3/LinearAlgebraInteractiveIntro/EIG-0010/main}{Describing Eigenvalues and Eigenvectors Algebraically and Geometrically} and \href{https://ximera.osu.edu/linearalgebradzv3/LinearAlgebraInteractiveIntro/LTR-0070/main}{Geometric Transformations of the Plane} for reference.
\begin{problem}\label{prob:eigvectorstransfr2_1}
Recall that a vertical stretch/compression of the plane is a linear transformation whose standard matrix is $$M_v=\begin{bmatrix}1&0\\0&k\end{bmatrix}$$
Answer: A basis for $\mathcal{S}$ is $\left\{\begin{bmatrix}1\\\answer{0}\end{bmatrix}\right\}$
What is the algebraic multiplicity of $\lambda$? $\answer{2}$
What is the geometric multiplicity of $\lambda$? $\answer{1}$
\end{problem}
\begin{problem}\label{prob:rotmatrixrealeig2}
Recall that a counterclockwise rotation of the plane through angle $\theta$ is a linear transformation whose standard matrix is $$M_{\theta}=\begin{bmatrix}\cos\theta&-\sin\theta\\\sin\theta&\cos\theta\end{bmatrix}$$
Verify that the eigenvalues of $M_{\theta}$ are
$$\lambda=\cos\theta\pm\sqrt{\cos^2\theta-1}$$
Use algebra, then use geometry to explain why $\lambda$ is a real number if and only if $\theta$ is a multiple of $\pi$.
Suppose $\theta$ is a multiple of $\pi$. Then the eigenspaces corresponding to the two eigenvalues are the same. Which of the following describes the eigenspace?
\begin{multipleChoice}
\end{problem}
\begin{problem}\label{prob:eigenspace1}
In this exercise we will prove that the eigenvectors associated with an eigenvalue $\lambda$ of an $n \times n$ matrix $A$, together with the zero vector, form a subspace of $\RR^n$. To do this, follow the outline below.
\begin{enumerate}
\item
Let $\vec{x}$ and $\vec{y}$ be eigenvectors of $A$ associated with $\lambda$. Show that $\vec{x}+\vec{y}$ is also an eigenvector of $A$ associated with $\lambda$. (This shows that the set of eigenvectors of $A$ associated with $\lambda$ is closed under addition.)
\item Show that the set of eigenvectors of $A$ associated with $\lambda$ is closed under scalar multiplication.
\end{enumerate}
\end{problem}
\section*{Exercise Source}
Practice Problem \ref{prob:3x3fromKuttler1} is adapted from Problem 7.1.11 of Ken Kuttler's \href{https://open.umn.edu/opentextbooks/textbooks/a-first-course-in-linear-algebra-2017}{\it A First Course in Linear Algebra}. (CC-BY)