# MATH 323 Lecture 4

« previous | Thursday, September 6, 2012 | next »

## Matrices (cont'd)

Constructor:

1. $\mathrm {Mat} _{n}(\mathbb {R} )$
2. $\mathrm {M} _{n}(\mathbb {R} )$
3. $\mathrm {M} _{m\times n}(\mathbb {R} )$

1 and 2 are called a ring because, being sets of square matrices, they have both the $+$ and $\cdot$ operations defined on them.

### Multiplication (cont'd)

$A\,B=C$, where

$c_{ij}=\sum _{t=1}^{n}a_{it}\,b_{tj}={\vec {a}}(i,:)\cdot {\vec {b}}(:,j)$

Multiplication is NOT commutative: in general, $AB\neq BA$.

Due to associativity of multiplication in #Theorem 1.3.2, we do not write parentheses around multiplication:

• $(A\,B)\,C=A\,(B\,C)=A\,B\,C$
• $A_{1}\,A_{2}\,\dots \,A_{n}$

### Exponentiation

$A^{n}=\underbrace {A\,\cdots \,A} _{n}$ for positive integer $n$

### Identity Matrix

Plays the role of "1" in multiplication:

$I=I_{n}=(\delta _{ij})$ , where $\delta _{ij}={\begin{cases}1&i=j\\0&i\neq j\end{cases}}$

(In other words, 1s along the diagonal, 0s everywhere else) ...such that

$A\,I=I\,A=A$

### Standard Basis

Acts as an identity vector:

$I=({\vec {e}}_{1},\ldots ,{\vec {e}}_{n})$ , where ${\vec {e}}_{i}$ is a vector of 0s, except the $i$ th element is 1.

### Zero Matrix

Plays role of 0 in addition:

$0={\begin{bmatrix}0&\dots &0\\\vdots &\ddots &\vdots \\0&\dots &0\end{bmatrix}}$

...such that

$A+0=0+A=A$

### Inverse Matrix

If $A\,B=B\,A=I_{n}$ for $n\times n$ matrices $A$ and $B$ , we say that $B$ is the inverse of $A$ : $B=A^{-1}$ .

Not every matrix is invertible. A matrix that does not have a multiplicative inverse is said to be singular.
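To make singularity concrete, here is a minimal sketch (the helper name `inverse_2x2` is my own) using the fact that a $2\times 2$ matrix $\begin{bmatrix}a&b\\c&d\end{bmatrix}$ is invertible iff $ad-bc\neq 0$:

```python
def inverse_2x2(A):
    """Return A^{-1} for a 2x2 matrix A, or None if A is singular."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        return None  # singular: no multiplicative inverse exists
    # Classical adjugate formula for the 2x2 inverse
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [5, 3]]   # det = 2*3 - 1*5 = 1, so invertible
S = [[1, 2], [2, 4]]   # det = 1*4 - 2*2 = 0, so singular

print(inverse_2x2(A))  # [[3.0, -1.0], [-5.0, 2.0]]
print(inverse_2x2(S))  # None
```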

### Transpose of a Matrix

For a $m\times n$ matrix $A=(a_{ij})$ , the transpose of $A$ , written $A^{T}$ , will be $n\times m$ and is defined as follows:

$A^{T}=(a_{ij}^{T})$ , where $a_{ij}^{T}=a_{ji}$

Geometrically, the first row becomes the first column, the second row becomes the second column, etc. The matrix is simply "reflected" about its "diagonal".

Rules:

1. $(A^{T})^{T}=A$
2. $(\alpha \,A)^{T}=\alpha \,A^{T}$ , $\alpha \in \mathbb {R}$
3. $(A+B)^{T}=A^{T}+B^{T}$
4. $(A\,B)^{T}=B^{T}\,A^{T}$ (opposite order)
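Rule 4 is the least obvious one, so here is a small sketch checking it on concrete matrices (the helpers `matmul` and `transpose` and the example matrices are my own):

```python
def matmul(A, B):
    """c_ij = sum_t a_it * b_tj (row i of A dotted with column j of B)."""
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    """(A^T)_ij = A_ji: rows become columns."""
    return [list(row) for row in zip(*A)]

A = [[1, 2], [3, 0], [5, 3]]   # 3x2
B = [[4, 1, 0], [2, 2, 1]]     # 2x3

lhs = transpose(matmul(A, B))             # (AB)^T, a 3x3 transposed
rhs = matmul(transpose(B), transpose(A))  # B^T A^T -- note the opposite order
print(lhs == rhs)  # True
```

Note that $A^{T}B^{T}$ would not even have compatible dimensions here ($2\times 3$ times $2\times 3$), which is one way to remember why the order must flip.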

An $n\times n$ matrix $A$ is said to be symmetric iff $A^{T}=A$ . This means that $a_{ij}=a_{ji}$ .

#### Example

${\begin{aligned}A&={\begin{bmatrix}1&2\\3&0\\5&3\end{bmatrix}}\\A^{T}&={\begin{bmatrix}1&3&5\\2&0&3\end{bmatrix}}\end{aligned}}$

### Diagonal Matrix

A diagonal matrix is a square ($n\times n$ ) matrix whose nonzero entries all lie along its diagonal ($\left\{a_{ij}\mid i=j\right\}$ ); every off-diagonal entry is 0.

A special form of diagonal matrix is $D=\alpha \,I$ (i.e. all numbers along the diagonal are the same): these special diagonal matrices commute with arbitrary $n\times n$ matrices.
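A quick numerical check of that commuting claim (the helper names and the example matrix are my own): since $D=\alpha I$, both $DA$ and $AD$ equal $\alpha A$.

```python
def matmul(A, B):
    """Standard matrix product: c_ij = sum_t a_it * b_tj."""
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def scalar_identity(alpha, n):
    """D = alpha * I_n: alpha on the diagonal, 0 elsewhere."""
    return [[alpha if i == j else 0 for j in range(n)] for i in range(n)]

A = [[1, 2], [3, 4]]
D = scalar_identity(3, 2)   # D = 3*I

print(matmul(D, A) == matmul(A, D))  # True: DA = AD = 3A
```

A general diagonal matrix (distinct diagonal entries) does not commute with arbitrary matrices; only these scalar multiples of $I$ do.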

## Matrices and Graphs

A graph consists of vertices (data points) and edges that connect them.

A graph with $n$ vertices ($v_{1}$ through $v_{n}$ ) can be represented by an $n\times n$ adjacency matrix:

$A:a_{ij}={\begin{cases}1&v_{i}{\mbox{ is connected with }}v_{j}\\0&{\mbox{otherwise}}\end{cases}}$

First, let $a_{ij}^{(k)}$ be the ($i$ , $j$ ) entry of $A^{k}$ .

$a_{ij}^{(k)}$ represents the number of walks of length $k$ from $v_{i}$ to $v_{j}$ .
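As a small sketch of walk counting (the path graph $v_1 - v_2 - v_3$ and the `matmul` helper are my own example):

```python
def matmul(A, B):
    """Standard matrix product: c_ij = sum_t a_it * b_tj."""
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Adjacency matrix of the path graph v1 - v2 - v3
A = [[0, 1, 0],
     [1, 0, 1],
     [0, 1, 0]]

A2 = matmul(A, A)  # entries count walks of length 2
print(A2[0][2])  # 1: one walk of length 2 from v1 to v3 (through v2)
print(A2[1][1])  # 2: two walks of length 2 from v2 back to itself
```

The diagonal entry $a_{22}^{(2)}=2$ counts the walks $v_2 \to v_1 \to v_2$ and $v_2 \to v_3 \to v_2$.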

## Theorem 1.3.2

For all $\alpha ,\beta \in \mathbb {R}$ and for all matrices $A,B,C$ for which the indicated operations are defined:

1. $A+B=B+A$ (commutativity of addition)
2. $(A+B)+C=A+(B+C)$ (associativity of addition)
3. $(A\,B)\,C=A\,(B\,C)$ (associativity of multiplication)
4. $A\,(B+C)=A\,B+A\,C$ (left distributivity)
5. $(A+B)\,C=A\,C+B\,C$ (right distributivity)
6. $(\alpha \,\beta )A=\alpha (\beta \,A)$
7. $\alpha \,(A\,B)=(\alpha \,A)\,B=A\,(\alpha \,B)$
8. $(\alpha +\beta )\,A=\alpha \,A+\beta \,A$
9. $\alpha \,(A+B)=\alpha \,A+\alpha \,B$

## Theorem 1.3.3

If $A$ and $B$ are nonsingular $n\times n$ matrices, then $A\,B$ is also nonsingular and $(A\,B)^{-1}=B^{-1}\,A^{-1}$ (note the opposite order on the right-hand side)
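A numerical sanity check of the theorem for $2\times 2$ matrices (the helpers and example matrices are my own; both determinants are 1, so the arithmetic stays exact in floating point):

```python
def matmul(A, B):
    """Standard matrix product: c_ij = sum_t a_it * b_tj."""
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def inverse_2x2(M):
    """Adjugate formula; assumes M is nonsingular (det != 0)."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [5, 3]]   # det = 1, nonsingular
B = [[1, 1], [1, 2]]   # det = 1, nonsingular

lhs = inverse_2x2(matmul(A, B))               # (AB)^{-1}
rhs = matmul(inverse_2x2(B), inverse_2x2(A))  # B^{-1} A^{-1}
print(lhs == rhs)  # True -- note the opposite order on the right
```

Intuitively, to undo "apply $B$, then apply $A$" you must undo $A$ first and then $B$, which is why the inverses appear in reverse order.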

### Corollary

For nonsingular matrices $A_{1},\ldots ,A_{k}$ , $A_{1}\,A_{2}\,\dots \,A_{k}$ is also nonsingular and $(A_{1}\,A_{2}\,\dots \,A_{k})^{-1}=A_{k}^{-1}\,\dots \,A_{2}^{-1}\,A_{1}^{-1}$ 