# MATH 323 Lecture 4


« previous | Thursday, September 6, 2012 | next »

## Matrices (cont'd)

Constructor:

1. ${\displaystyle \mathrm {Mat} _{n}(\mathbb {R} )}$
2. ${\displaystyle \mathrm {M} _{n}(\mathbb {R} )}$
3. ${\displaystyle \mathrm {M} _{m\times n}(\mathbb {R} )}$

Forms 1 and 2 denote the same set of square matrices; this set is called a ring because both the + and · operations are defined on it (square matrices of the same size can always be added and multiplied).

### Multiplication (cont'd)

${\displaystyle A\,B=C}$

${\displaystyle c_{ij}=\sum _{t=1}^{n}a_{it}\,b_{tj}={\vec {a}}(i,:)\cdot {\vec {b}}(:,j)}$

NOT commutative: in general, ${\displaystyle AB\neq BA}$

Due to the associativity of multiplication (#Theorem 1.3.2), we do not write parentheses around products:

• ${\displaystyle (A\,B)\,C=A\,(B\,C)=A\,B\,C}$
• ${\displaystyle A_{1}\,A_{2}\,\dots \,A_{n}}$
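The entry formula ${\displaystyle c_{ij}=\sum _{t=1}^{n}a_{it}\,b_{tj}}$ can be sketched directly in code. The helper below is ours (not from the lecture), using plain nested lists:

```python
# Sketch of the entry-wise product formula c_ij = sum_t a_it * b_tj
# (hypothetical helper, not from the lecture).
def mat_mul(A, B):
    n_inner = len(B)  # columns of A must equal rows of B
    assert all(len(row) == n_inner for row in A)
    return [[sum(A[i][t] * B[t][j] for t in range(n_inner))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
print(mat_mul(A, B))  # [[2, 1], [4, 3]]
print(mat_mul(B, A))  # [[3, 4], [1, 2]] -- AB != BA, as claimed above
```

Note that the two products differ, illustrating non-commutativity.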

### Exponentiation

${\displaystyle A^{n}=\underbrace {A\,\cdots \,A} _{n}}$ for a positive integer ${\displaystyle n}$

### Identity Matrix

Plays the role of "1" in multiplication:

${\displaystyle I=I_{n}=(\delta _{ij})}$, where ${\displaystyle \delta _{ij}={\begin{cases}1&i=j\\0&i\neq j\end{cases}}}$

(In other words, 1s along diagonal, 0s everywhere else) ...such that

${\displaystyle A\,I=I\,A=A}$
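A minimal numerical check of ${\displaystyle A\,I=I\,A=A}$, building ${\displaystyle I_{n}}$ from the Kronecker delta (function names are ours, not the lecture's):

```python
# Build I_n via the Kronecker delta: 1 on the diagonal, 0 elsewhere.
def identity(n):
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def mat_mul(A, B):
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2], [3, 0]]
I2 = identity(2)
print(mat_mul(A, I2) == A)  # True
print(mat_mul(I2, A) == A)  # True
```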

### Standard Basis

The columns of the identity matrix form the standard basis:

${\displaystyle I=({\vec {e}}_{1},\ldots ,{\vec {e}}_{n})}$, where ${\displaystyle {\vec {e}}_{i}}$ is a vector of 0s, except the ${\displaystyle i}$th element is 1.

### Zero Matrix

Plays role of 0 in addition:

${\displaystyle 0={\begin{bmatrix}0&\dots &0\\\vdots &\ddots &\vdots \\0&\dots &0\end{bmatrix}}}$

...such that

${\displaystyle A+0=0+A=A}$

### Inverse Matrix

If ${\displaystyle A\,B=B\,A=I_{n}}$ for ${\displaystyle n\times n}$ matrices ${\displaystyle A}$ and ${\displaystyle B}$, we say that ${\displaystyle B}$ is the inverse of ${\displaystyle A}$: ${\displaystyle B=A^{-1}}$.

Not every matrix is invertible. A matrix that does not have a multiplicative inverse is said to be singular.

### Transpose of a Matrix

For an ${\displaystyle m\times n}$ matrix ${\displaystyle A=(a_{ij})}$, the transpose of ${\displaystyle A}$, written ${\displaystyle A^{T}}$, will be ${\displaystyle n\times m}$ and is defined as follows:

${\displaystyle A^{T}=(a_{ij}^{T})}$, where ${\displaystyle a_{ij}^{T}=a_{ji}}$

Geometrically, the first row becomes the first column, the second row becomes the second column, and so on: the matrix is simply "reflected" about its main diagonal.

Rules:

1. ${\displaystyle (A^{T})^{T}=A}$
2. ${\displaystyle (\alpha \,A)^{T}=\alpha \,A^{T}}$, ${\displaystyle \alpha \in \mathbb {R} }$
3. ${\displaystyle (A+B)^{T}=A^{T}+B^{T}}$
4. ${\displaystyle (A\,B)^{T}=B^{T}\,A^{T}}$ (opposite order)

An ${\displaystyle n\times n}$ matrix ${\displaystyle A}$ is said to be symmetric iff ${\displaystyle A^{T}=A}$, i.e. ${\displaystyle a_{ij}=a_{ji}}$ for all ${\displaystyle i,j}$.

#### Example

{\displaystyle {\begin{aligned}A&={\begin{bmatrix}1&2\\3&0\\5&3\end{bmatrix}}\\A^{T}&={\begin{bmatrix}1&3&5\\2&0&3\end{bmatrix}}\end{aligned}}}
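Rule 4, ${\displaystyle (A\,B)^{T}=B^{T}\,A^{T}}$, can be spot-checked numerically, reusing the ${\displaystyle 3\times 2}$ matrix from the example above (the ${\displaystyle 2\times 3}$ matrix ${\displaystyle B}$ and the helper names are our own choices):

```python
# Transpose: entry (i, j) of A^T is entry (j, i) of A.
def transpose(A):
    return [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]

def mat_mul(A, B):
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2], [3, 0], [5, 3]]   # the 3x2 matrix from the example
B = [[1, 0, 2], [0, 1, 1]]     # an arbitrary 2x3 matrix
print(transpose(mat_mul(A, B)) == mat_mul(transpose(B), transpose(A)))  # True
print(transpose(transpose(A)) == A)  # rule 1: True
```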

### Diagonal Matrix

A diagonal matrix is a square (${\displaystyle n\times n}$) matrix whose only (possibly) nonzero values lie along its diagonal (${\displaystyle \left\{a_{ij}\mid i=j\right\}}$); all off-diagonal entries are 0.

A special form of diagonal matrix is ${\displaystyle D=\alpha \,I}$ (i.e. all numbers along the diagonal are the same): these scalar matrices commute with arbitrary matrices of compatible size.
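A quick sketch of this distinction: a scalar matrix ${\displaystyle \alpha I}$ commutes with any square matrix of the same size, while a general diagonal matrix need not (helper names are ours):

```python
def mat_mul(A, B):
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def diag(entries):
    n = len(entries)
    return [[entries[i] if i == j else 0 for j in range(n)] for i in range(n)]

A = [[1, 2], [3, 4]]
D = diag([5, 5])   # D = 5*I, a scalar matrix
print(mat_mul(D, A) == mat_mul(A, D))    # True

D2 = diag([1, 2])  # diagonal but not scalar
print(mat_mul(D2, A) == mat_mul(A, D2))  # False
```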

## Matrices and Graphs

A graph consists of vertices (data points) and edges that connect them.

A graph with ${\displaystyle n}$ vertices (${\displaystyle v_{1}}$ through ${\displaystyle v_{n}}$) can be represented by an ${\displaystyle n\times n}$ adjacency matrix:

${\displaystyle A:a_{ij}={\begin{cases}1&v_{i}{\mbox{ is connected with }}v_{j}\\0&{\mbox{otherwise}}\end{cases}}}$

Let ${\displaystyle a_{ij}^{(k)}}$ be the (${\displaystyle i}$, ${\displaystyle j}$) entry of ${\displaystyle A^{k}}$.

${\displaystyle a_{ij}^{(k)}}$ represents the number of walks of length ${\displaystyle k}$ from ${\displaystyle v_{i}}$ to ${\displaystyle v_{j}}$.
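This walk-counting fact can be checked on a small example. Below, the path graph ${\displaystyle v_{1}}$–${\displaystyle v_{2}}$–${\displaystyle v_{3}}$ is our own illustrative choice:

```python
# Count walks via powers of the adjacency matrix (illustrative sketch).
def mat_mul(A, B):
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_pow(A, k):
    result = A
    for _ in range(k - 1):
        result = mat_mul(result, A)
    return result

# Path graph v1 -- v2 -- v3:
A = [[0, 1, 0],
     [1, 0, 1],
     [0, 1, 0]]
A2 = mat_pow(A, 2)
print(A2[0][2])  # 1: the single length-2 walk v1 -> v2 -> v3
print(A2[1][1])  # 2: the walks v2 -> v1 -> v2 and v2 -> v3 -> v2
```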

## Theorem 1.3.2

For all ${\displaystyle \alpha ,\beta \in \mathbb {R} }$ and for all matrices ${\displaystyle A,B,C}$ for which the indicated operations are defined:

1. ${\displaystyle A+B=B+A}$ (commutativity of addition)
2. ${\displaystyle (A+B)+C=A+(B+C)}$ (associativity of addition)
3. ${\displaystyle (A\,B)\,C=A\,(B\,C)}$ (associativity of multiplication)
4. ${\displaystyle A\,(B+C)=A\,B+A\,C}$ (right distributivity)
5. ${\displaystyle (A+B)\,C=A\,C+B\,C}$ (left distributivity)
6. ${\displaystyle (\alpha \,\beta )A=\alpha (\beta \,A)}$
7. ${\displaystyle \alpha \,(A\,B)=(\alpha \,A)\,B=A\,(\alpha \,B)}$
8. ${\displaystyle (\alpha +\beta )\,A=\alpha \,A+\beta \,A}$
9. ${\displaystyle \alpha \,(A+B)=\alpha \,A+\alpha \,B}$

## Theorem 1.3.3

If ${\displaystyle A}$ and ${\displaystyle B}$ are nonsingular ${\displaystyle n\times n}$ matrices, then ${\displaystyle A\,B}$ is also nonsingular and ${\displaystyle (A\,B)^{-1}=B^{-1}\,A^{-1}}$

(note the opposite order on the right-hand side)
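A numerical spot-check of ${\displaystyle (A\,B)^{-1}=B^{-1}\,A^{-1}}$ for ${\displaystyle 2\times 2}$ matrices, using the closed-form ${\displaystyle 2\times 2}$ inverse (a sketch; helper names and the test matrices are ours):

```python
def mat_mul(A, B):
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Closed-form inverse of a 2x2 matrix [[a, b], [c, d]].
def inv2(M):
    (a, b), (c, d) = M
    det = a * d - b * c  # must be nonzero (nonsingular)
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 1]]
lhs = inv2(mat_mul(A, B))
rhs = mat_mul(inv2(B), inv2(A))  # note the reversed order
print(lhs == rhs)  # True
```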

### Corollary

For nonsingular matrices ${\displaystyle A_{1},\ldots ,A_{k}}$, ${\displaystyle A_{1}\,A_{2}\,\dots \,A_{k}}$ is also nonsingular and ${\displaystyle (A_{1}\,A_{2}\,\dots \,A_{k})^{-1}=A_{k}^{-1}\,\dots \,A_{2}^{-1}\,A_{1}^{-1}}$