# Basics

## What is a matrix?

$$A \in \mathcal{R}^{m \times n}, \quad A = (a_{ij})$$

$$m$$ = number of rows
$$n$$ = number of columns
$$m=n \Rightarrow$$ "square matrix"

If $$A$$ and $$B$$ are the same size,

$$A, B \in \mathcal{R}^{m \times n}$$
$$\Rightarrow (A+B)_{ij} = A_{ij} + B_{ij}$$

It is associative.
$$(A+B)+C = A+(B+C)$$

It is commutative.
$$A+B = B+A$$

The additive identity is the matrix of all zeros (written as $$0$$).
$$0_{ij} = 0 \; \forall i, j$$
$$A+0 = 0+A = A$$
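
These properties are easy to check numerically. A minimal sketch in NumPy (the library choice is ours, not part of the notes):

```python
import numpy as np

# Two matrices of the same size (2 x 3) add elementwise.
A = np.array([[1, 2, 3],
              [4, 5, 6]])
B = np.array([[10, 20, 30],
              [40, 50, 60]])
C = np.array([[7, 8, 9],
              [1, 1, 1]])
Z = np.zeros_like(A)            # the zero matrix 0

S = A + B                       # (A+B)_ij = A_ij + B_ij
assoc = np.array_equal((A + B) + C, A + (B + C))   # associativity
comm = np.array_equal(A + B, B + A)                # commutativity
ident = np.array_equal(A + Z, A)                   # 0 is the additive identity
```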

## Multiplication

If $$A$$ has the same number of columns as $$B$$ has rows,
then we can multiply them.

$$A\in \mathcal{R}^{m \times n}, B\in \mathcal{R}^{n \times p} \Rightarrow AB\in \mathcal{R}^{m \times p}$$
$$(AB)_{ij} = \sum_{k=1}^n a_{ik} b_{kj}$$

This is the dot product
of the $$i$$th row of $$A$$
with the $$j$$th column of $$B$$.
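
A quick numerical check of the entry formula, again sketched in NumPy (our choice of tool):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])          # 3 x 2
B = np.array([[7,  8,  9],
              [10, 11, 12]])    # 2 x 3

P = A @ B                       # product is 3 x 3

# Entry (i, j) is the dot product of row i of A with column j of B.
i, j = 1, 2
entry = A[i, :] @ B[:, j]       # sum over k of a_ik * b_kj
```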

## Properties of Multiplication

It is associative.
$$(AB)C$$ = $$A(BC)$$

It is not necessarily commutative.
$$AB \stackrel{?}{=} BA$$

The identity for multiplication is the matrix
with $$1$$s on the diagonal and $$0$$s elsewhere.
It is written as $$I$$.
$$I_{ij} = 1 \mbox{ if } i=j, 0 \mbox{ otherwise}$$
$$IA = AI = A$$
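
A small NumPy example illustrating all three properties (the specific matrices are ours, chosen so that $$AB \neq BA$$):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])          # a permutation matrix
C = np.array([[1, 0],
              [2, 1]])
I = np.eye(2, dtype=int)

assoc = np.array_equal((A @ B) @ C, A @ (B @ C))   # always holds
comm = np.array_equal(A @ B, B @ A)                # fails here: AB swaps A's
                                                   # columns, BA swaps A's rows
ident = np.array_equal(I @ A, A) and np.array_equal(A @ I, A)
```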

## Transpose

The transpose of a matrix $$A$$ is another matrix
with the rows and columns of $$A$$ interchanged.
It is written as $$A^T$$.

$$A^T_{ij} = A_{ji}$$

Note: $$(A^T)^T = A$$
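
In NumPy (again our choice of tool) the transpose is just `.T`:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])       # 2 x 3

At = A.T                        # 3 x 2, with (A^T)_ij = A_ji
```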

# Inverse of a Matrix

## What is it?

The inverse of a square matrix $$A$$ is a matrix that,
when multiplied with $$A$$, gives the identity matrix.
It is written as $$A^{-1}$$.

$$AA^{-1} = A^{-1}A = I$$

BUT it does not exist for all $$A$$.
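
A quick sketch with NumPy's built-in inverse (`np.linalg.inv`), including a matrix for which no inverse exists (example matrices are ours):

```python
import numpy as np

A = np.array([[2., 1.],
              [5., 3.]])        # det = 2*3 - 1*5 = 1, so A is invertible
A_inv = np.linalg.inv(A)

# A A^{-1} = A^{-1} A = I, up to floating-point rounding.
left = np.allclose(A @ A_inv, np.eye(2))
right = np.allclose(A_inv @ A, np.eye(2))

# Not every matrix has an inverse: a singular one raises LinAlgError.
S = np.array([[1., 2.],
              [2., 4.]])        # second row is twice the first
try:
    np.linalg.inv(S)
    singular_detected = False
except np.linalg.LinAlgError:
    singular_detected = True
```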

## Why is it important?

Invertibility comes up again and again in linear algebra.
We'll examine four different criteria for invertibility.

# Row Operations

## Elementary Row Operations

There are only three!

1. Switch two rows: $$r_i, r_j \rightarrow r_j, r_i$$
2. Multiply a row by a nonzero constant: $$r_i \rightarrow Cr_i, \; C \neq 0$$
3. Add two rows together: $$r_i \rightarrow r_i + r_j$$

You can also do 2 and 3 at the same time:
$$r_i \rightarrow C_1 r_i + C_2 r_j, \; C_1 \neq 0$$
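
Each operation is a single line on a NumPy array (the example matrix is ours):

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])

R = A.copy()
R[[0, 1]] = R[[1, 0]]           # 1. switch rows 0 and 1

M = A.copy()
M[0] = 5 * M[0]                 # 2. multiply row 0 by the constant 5

S = A.copy()
S[1] = S[1] + S[0]              # 3. add row 0 to row 1

T = A.copy()
T[1] = 2 * T[1] + 3 * T[0]      # 2 and 3 combined: r_1 -> 2 r_1 + 3 r_0
```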

## Can you invert a matrix? (Part 1)

$$A$$ is invertible if and only if
it can be reduced to $$I$$
using elementary row operations.

$$A^{-1}$$ is the result of
the same operations applied to $$I$$.

The method of doing this
on the augmented matrix $$[A \; \vert \; I]$$
is called Gauss-Jordan elimination
(a variant of Gaussian elimination).
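
The whole procedure fits in a short function. A sketch (ours, not from the notes) that reduces $$[A \; \vert \; I]$$ using only the three elementary row operations, with partial pivoting for numerical stability:

```python
import numpy as np

def invert(A):
    """Invert A by row-reducing the augmented matrix [A | I] to [I | A^-1]."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])   # [A | I]
    for col in range(n):
        # Op 1: switch rows so the pivot is the column's largest entry.
        pivot = col + np.argmax(np.abs(M[col:, col]))
        if np.isclose(M[pivot, col], 0.0):
            raise ValueError("matrix is not invertible")
        M[[col, pivot]] = M[[pivot, col]]
        # Op 2: scale the pivot row so the pivot equals 1.
        M[col] /= M[col, col]
        # Op 3: subtract multiples of the pivot row from the other rows.
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]
    return M[:, n:]                               # right half is A^-1

A = np.array([[2., 1.],
              [5., 3.]])
A_inv = invert(A)
```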

## Intuition for inverting Part 1

By going from $$A$$ to $$I$$ with row operations,
we are in effect multiplying $$A$$ on the left by $$A^{-1}$$:
each row operation is a left-multiplication by an elementary matrix,
and their product $$E$$ satisfies $$EA = I$$, so $$E = A^{-1}$$.

Furthermore, since $$EI = E = A^{-1}$$,
applying the same operations to $$I$$
gives $$A^{-1}$$ explicitly.

# Linear independence

## Vectors

A vector $$v$$ is a matrix with one column.

$$v \in \mathcal{R}^n$$

## Linear independence

A set of vectors $$\{v_i\}_{i=1}^n$$ is linearly independent if
the only linear combination equal to zero
has all coefficients equal to zero.

$$\{v_i\}_{i=1}^n$$ linearly independent
$$\Longleftrightarrow \left( \sum_{i=1}^n c_i v_i = 0 \Leftrightarrow c_i = 0 \; \forall i \right)$$

Alternatively, $$\{v_i\}_{i=1}^n$$ is linearly independent if
no vector $$v_j$$ can be written as
a linear combination of the other vectors.
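
A convenient numerical test (our sketch, using NumPy's matrix rank): stack the vectors as columns; the set is linearly independent exactly when the matrix has full column rank.

```python
import numpy as np

v1 = np.array([1., 0., 0.])
v2 = np.array([0., 1., 0.])
v3 = np.array([1., 1., 0.])     # v3 = v1 + v2

# {v1, v2} is independent: rank equals the number of vectors.
independent = np.linalg.matrix_rank(np.column_stack([v1, v2])) == 2

# {v1, v2, v3} is dependent: rank is smaller than the number of vectors.
dependent = np.linalg.matrix_rank(np.column_stack([v1, v2, v3])) < 3
```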

## Can you invert a matrix? (Part 2)

$$A$$ is invertible if and only if
its rows are linearly independent.

Equivalently, $$A$$ is invertible if and only if
its columns are linearly independent.

# Determinants

## What is it?

The determinant is a value
associated with square matrices.
It is written as $$\det(A)$$.

## Can you invert a matrix? (Part 3)

A square matrix $$A$$ is invertible if and only if
$$\det(A) \neq 0$$.

## How do you calculate it?

It's best shown with examples.

## 2 x 2 Determinant

$$A = \left(\begin{array}{cc} a & b\\ c & d\\ \end{array}\right)$$
$$\Rightarrow \det(A) = ad-bc$$

## 3 x 3 Determinant

$$A = \left( \begin{array}{ccc} a_1 & a_2 & a_3\\ b_1 & b_2 & b_3\\ c_1 & c_2 & c_3\\ \end{array} \right)$$
$$\Rightarrow \det(A) = a_1(b_2c_3-b_3c_2) -$$
$$a_2(b_1c_3-b_3c_1) +$$
$$a_3(b_1c_2-b_2c_1)$$

## n x n Determinant

For any row $$i$$, $$\det(A) = \sum_{j=1}^n a_{ij} \cdot (-1)^{i+j} \cdot M_{ij},$$ where the minor $$M_{ij}$$ is the determinant of the submatrix formed by deleting row $$i$$ and column $$j$$ of $$A$$.
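
The cofactor expansion translates directly into a recursive function. A sketch (exponential time, so only for small matrices; the test matrices are ours):

```python
import numpy as np

def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0
    for j in range(n):
        # The (1, j) minor: delete row 0 and column j, then recurse.
        minor = np.delete(A[1:], j, axis=1)
        # (-1)**j matches the sign (-1)^(1+j) in the 1-based formula.
        total += A[0, j] * (-1) ** j * det(minor)
    return total

A2 = np.array([[1, 2],
               [3, 4]])          # ad - bc = 4 - 6 = -2
A3 = np.array([[2, 0, 1],
               [1, 3, 2],
               [1, 1, 4]])
```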

# Eigenvalues, Eigenvectors

## What are they?

For a square matrix $$A$$,
we often examine special vectors $$v$$
that are only scaled (not rotated) when $$A$$ is applied.

That is, for a certain scalar $$\lambda$$,
$$Av = \lambda v$$.

In this case, we call $$\lambda$$ and $$v$$
an eigenvalue and eigenvector of $$A$$.

## Why are they important?

After doing a bit of algebra, we can notice:

$$Av = \lambda v$$
$$\Rightarrow Av - \lambda Iv = 0$$
$$\Rightarrow (A - \lambda I)v = 0.$$

Now we can make an assertion...

## Can you invert a matrix? (Part 4)

#### Assertion

If $$v \neq 0$$, then $$(A-\lambda I)$$ is not invertible.
Thus, $$\det{(A-\lambda I)} = 0$$.

#### Proof

Suppose $$(A-\lambda I)^{-1}$$ exists.

$$(A-\lambda I)v = 0$$
$$\Rightarrow (A-\lambda I)^{-1}(A-\lambda I)v = 0$$
$$\Rightarrow Iv = 0$$
$$\Rightarrow v=0.$$
This contradicts our original assumption that $$v \neq 0$$.

QED

## How do I find eigenvalues?

The $$v=0$$ case is boring.
$$A\times 0 = 0 = \lambda \times 0 \Rightarrow \lambda$$ can be anything.

So, let's focus on $$v \neq 0$$, so that $$\det{(A-\lambda I)} = 0$$.

The left-hand side, $$\det{(A-\lambda I)}$$, is an $$n^{th}$$-degree polynomial in $$\lambda$$,
called the characteristic polynomial.
Thus, by the Fundamental Theorem of Algebra,
every $$n \times n$$ matrix has $$n$$ eigenvalues (possibly complex).

However, the eigenvalues are not necessarily distinct.
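
For a concrete $$2 \times 2$$ example (our numbers), the characteristic polynomial can be expanded by hand and its roots compared against NumPy:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])

# det(A - l*I) = (2-l)^2 - 1 = l^2 - 4l + 3 = (l-1)(l-3)
char_roots = np.sort(np.roots([1.0, -4.0, 3.0]))  # roots of the polynomial

eigvals = np.sort(np.linalg.eigvals(A))           # NumPy's eigenvalues
```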

## How do I find eigenvectors?

Suppose the eigenvalues $$\{\lambda_i\}_{i=1}^n$$ are all distinct.
To get the eigenvectors $$\{v_i\}_{i=1}^n$$ , solve the original equations:
$$(A - \lambda_i I)v_i = 0, \quad i = 1,\ldots,n$$.

For each $$\lambda_i$$, this produces a system of simultaneous linear equations
in the components of $$v_i$$.
Solve it, writing each component in terms of one common component.
Finally, factor the common component out.

For example, if at first we obtain $$(v_1, 3v_1, 2v_1)$$,
then factoring out $$v_1$$ gives the eigenvector $$(1, 3, 2)$$
(any nonzero multiple is also an eigenvector).
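
In practice a library does this for you. A sketch with `np.linalg.eig`, verifying $$Av = \lambda v$$ (NumPy returns unit-length eigenvectors, which is the same scaling freedom as factoring out the common component):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])

eigvals, eigvecs = np.linalg.eig(A)   # eigenvectors are the COLUMNS of eigvecs

# Check A v_i = lambda_i v_i for every eigenpair.
checks = [np.allclose(A @ eigvecs[:, i], eigvals[i] * eigvecs[:, i])
          for i in range(len(eigvals))]
```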