\newglossaryentry{inverse}
{name={inverse matrix},
- description={The\index{inverse matrix} inverse $\mA^{-1}$ of a square matrix $\mA \in \mathbb{R}^{n \times n}$
- (if it exists, then $\mA$ is called invertible) is defined by $\mA \mA^{-1} = \mA^{-1} \mA = \mI$, where $\mI$
- is the identity matrix. A matrix is invertible if and only if its \gls{det} is non-zero. Inverse matrices are used in
- solving systems of linear equations and in closed-form solutions of \gls{linreg}.
- Note that for non-square matrices, one may define one-sided inverses: a left inverse $\mB$
- satisfies $\mB\mA = \mI$ and a right inverse $\mC$ satisfies $\mA\mC = \mI$.\\
+ description={Consider a square matrix $\mA \in \mathbb{R}^{n \times n}$ which has full rank, i.e.,
+ its columns are linearly independent. It is then invertible, i.e., there is an\index{inverse matrix} inverse matrix
+ $\mA^{-1}$ such that $\mA \mA^{-1} = \mA^{-1} \mA = \mI$. A matrix is invertible if and only
+ if its \gls{det} is non-zero. Inverse matrices are used in solving systems of linear equations
+ and in closed-form solutions of \gls{linreg}. The concept of an inverse matrix can be generalized
+ to non-square matrices: one may define a left inverse $\mB$ by requiring $\mB\mA = \mI$, or a
+ right inverse $\mC$ that satisfies $\mA\mC = \mI$.\\
See also: \gls{det}, \gls{linreg}.},
first={inverse matrix},
text={inverse matrix}
}
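For concreteness, the description could close with a small worked example along the following lines (a sketch only; it reuses the file's $\mA$ and $\mI$ macros and assumes amsmath's pmatrix environment is available):

% Sketch of a possible addition to the inverse-matrix description.
For $n = 2$ and $\mA = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$ with $ad - bc \neq 0$,
the inverse is $\mA^{-1} = \frac{1}{ad - bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$,
and multiplying out confirms $\mA \mA^{-1} = \mA^{-1} \mA = \mI$.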
\newglossaryentry{det}
{name={determinant},
- description={The\index{determinant} \emph{determinant} $\det(\mA)$ of a square matrix
+ description={The\index{determinant} determinant $\det(\mA)$ of a square matrix
$\mA \in \mathbb{R}^{n \times n}$ is a scalar that characterizes how (the orientation of)
- volumes in $\mathbb{R}^n$ are altered by applying $\mA$. [Note that a matrix $\mA$ represents
- a linear transformation on $\mathbb{R}^{n}$.] In particular, $\det(\mA) > 0$ preserves
- orientation, $\det(\mA) < 0$ reverses orientation, and $\det(\mA) = 0$ collapses volume entirely,
- indicating that $\mA$ is non-invertible. Formally, $\det(\mA) = 0$ if and only if $\mA$ is non-invertible.
+ volumes in $\mathbb{R}^n$ are altered by applying $\mA$ \cite{GolubVanLoanBook,Strang2007}.
+ [Note that a matrix $\mA$ represents a linear transformation on $\mathbb{R}^{n}$.]
+ In particular, $\det(\mA) > 0$ indicates that $\mA$ preserves orientation, $\det(\mA) < 0$ that it
+ reverses orientation, and $\det(\mA) = 0$ that it collapses volumes entirely, i.e., $\mA$ is non-invertible.
The determinant also satisfies $\det(\mA \mB) = \det(\mA) \cdot \det(\mB)$, and if $\mA$ is
- diagonalizable with eigenvalues $\eigval{1}, \ldots, \eigval{n}$, then $\det(\mA) = \prod_{i=1}^{n} \eigval{i}$ \cite{HornMatAnalysis}.
- Geometrically, for the special cases $n=2$ (2D) and $n=3$ (3D), the determinant corresponds
- to the signed area or volume spanned by the column vectors of $\mA$.
+ diagonalizable with \glspl{eigenvalue} $\eigval{1}, \ldots, \eigval{n}$, then $\det(\mA) = \prod_{i=1}^{n} \eigval{i}$ \cite{HornMatAnalysis}.
+ For the special cases $n=2$ (2D) and $n=3$ (3D), the determinant can be interpreted as an oriented
+ area or volume spanned by the column vectors of $\mA$.
+ \begin{figure}
+ \begin{center}
+ \begin{tikzpicture}[x=2cm]
+ % Note: the ($(A)+(B)$) coordinate arithmetic below relies on the TikZ calc library.
+ % LEFT: standard basis vectors and unit square
+ \begin{scope}
+ \draw[->, thick] (0,0) -- (1,0) node[below right] {$\vx$};
+ \draw[->, thick] (0,0) -- (0,1) node[above left] {$\vy$};
+ % \draw[fill=gray!15] (0,0) -- (1,0) -- (1,1) -- (0,1) -- cycle;
+ % \node at (0.5,0.5) {\small unit square};
+ % \node at (0.5,-0.6) {standard basis};
+ \end{scope}
+ % RIGHT: transformed basis vectors and parallelogram
+ \begin{scope}[shift={(2.8,0)}]
+ \coordinate (A) at (1.5,0.5);
+ \coordinate (B) at (-0.2,1.2);
+ \draw[->, very thick, red] (0,0) -- (A) node[below right] {$\mA \vx$};
+ \draw[->, very thick, red] (0,0) -- (B) node[above left] {$\mA \vy$};
+ \draw[fill=red!20, opacity=0.6] (0,0) -- (A) -- ($(A)+(B)$) -- (B) -- cycle;
+ \draw[dashed] (A) -- ($(A)+(B)$);
+ \draw[dashed] (B) -- ($(A)+(B)$);
+ \node at (0.8,0.6) {\small $\det(\mA)$};
+ % Orientation arc
+ \draw[->, thick, blue] (0.4,0.0) arc[start angle=0, end angle=35, radius=0.6];
+ % \node[blue] at (0.25,1.25) {};
+ % \node at (0.8,-0.6) {transformed basis};
+ \end{scope}
+ % Arrow between plots
+ \draw[->, thick] (1.3,0.5) -- (2.4,0.5) node[midway, above] {$\mA$};
+ \end{tikzpicture}
+ \end{center}
+ \end{figure}
\\
See also: \gls{eigenvalue}, \gls{inverse}.
},
...
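To make the geometric reading concrete, a short numerical example could be appended to the determinant description as well (a sketch, again reusing the file's macros and assuming amsmath):

% Sketch of a possible addition to the determinant description.
For $\mA = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$ one has $\det(\mA) = ad - bc$; e.g.,
$\mA = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}$ has $\det(\mA) = 6$, maps the unit square to a
$2 \times 3$ rectangle of area $6$, and its eigenvalues $2$ and $3$ satisfy $\det(\mA) = 2 \cdot 3$.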
\newglossaryentry{linearmap}{
name={linear map},
- description={A\index{linear map} linear map $f: \mathbb{R}^n \rightarrow \mathbb{R}^m$ is a function that satisfies additivity :
+ description={A\index{linear map} linear map $f: \mathbb{R}^n \rightarrow \mathbb{R}^m$ is a \gls{function}
+ that satisfies additivity:
$f(\vx + \vy) = f(\vx) + f(\vy)$, and homogeneity:
- $f(c\vx) = c f(\vx)$ for all vectors $\vx, \vy \in \mathbb{R}^n$ and scalars $c \in \mathbb{R}$. In particular, $f(\mathbf{0}) = \mathbf{0}$. Any linear map can be represented as a matrix multiplication $f(\vx) = \mA \vx$ for some matrix $\mA \in \mathbb{R}^{m \times n}$. Linear maps are fundamental in \gls{linmodel}, \gls{linreg}, and \gls{pca}.\\
+ $f(c\vx) = c f(\vx)$ for all vectors $\vx, \vy \in \mathbb{R}^n$ and scalars $c \in \mathbb{R}$.
+ In particular, $f(\mathbf{0}) = \mathbf{0}$. Any linear map can be represented as a matrix
+ multiplication $f(\vx) = \mA \vx$ for some matrix $\mA \in \mathbb{R}^{m \times n}$.
+ The collection of real-valued linear maps for a given dimension $n$ constitutes a \gls{linmodel},
+ which is used in many \gls{ml} methods. \\
See also: \gls{linmodel}, \gls{linreg}, \gls{pca}, \gls{featurevec}.},
first={linear map},
text={linear map}
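A brief instance of the matrix representation could accompany this entry too (a sketch, assuming the file's \vx and \mA macros and amsmath):

% Sketch of a possible addition to the linear-map description.
For instance, $f(\vx) = \mA \vx$ with $\mA = \begin{pmatrix} 1 & 2 & 0 \\ 0 & 1 & -1 \end{pmatrix} \in \mathbb{R}^{2 \times 3}$
is a linear map from $\mathbb{R}^{3}$ to $\mathbb{R}^{2}$; additivity and homogeneity follow directly from
the rules of matrix-vector multiplication, e.g., $\mA(c\vx) = c\,(\mA\vx)$.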