\(m\) = number of rows

\(n\) = number of columns

\(m=n \Rightarrow \) "square matrix"

To add two matrices of the same dimensions, add them component by component.

\( A, B \in \mathbb{R}^{m \times n}\)

\(\Rightarrow (A+B)_{ij} = A_{ij} + B_{ij} \)
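A minimal sketch of componentwise addition in Python, representing a matrix as a list of rows (the name `mat_add` is illustrative, not from the text):

```python
def mat_add(A, B):
    """Componentwise sum of two equal-size matrices (lists of rows)."""
    assert len(A) == len(B) and len(A[0]) == len(B[0]), "dimensions must match"
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(mat_add(A, B))  # [[6, 8], [10, 12]]
```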

Matrix addition is associative:

\( (A+B)+C = A+(B+C) \)

It is also commutative:

\( A+B = B+A \)

The additive identity is the matrix of all zeros (written as \(0\)).

\( 0_{ij} = 0 \; \forall i, j \)

\( A+0 = 0+A = A \)

If the number of columns of \(A\) equals the number of rows of \(B\), then we can multiply them.

\( A\in \mathbb{R}^{m \times n}, B\in \mathbb{R}^{n \times p} \Rightarrow AB\in \mathbb{R}^{m \times p}\)

\( (AB)_{ij} = \sum_{k=1}^n A_{ik} B_{kj} \)

This is the dot product

of the \(i\)th row of \(A\)

with the \(j\)th column of \(B\).

Matrix multiplication is associative:

\( (AB)C = A(BC) \)

However, it is not commutative in general:

\( AB \neq BA \) for most \(A\) and \(B\).
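A sketch of the sum formula in Python, also showing a concrete pair where \(AB \neq BA\) (the name `mat_mul` is illustrative, not from the text):

```python
def mat_mul(A, B):
    """(AB)_ij = sum over k of A_ik * B_kj; requires cols(A) == rows(B)."""
    n = len(B)                     # inner dimension
    assert len(A[0]) == n, "inner dimensions must match"
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
print(mat_mul(A, B))  # [[2, 1], [4, 3]]
print(mat_mul(B, A))  # [[3, 4], [1, 2]]  -- different, so AB != BA here
```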

The multiplicative identity is the matrix with \(1\)s on the diagonal and \(0\)s otherwise.

It is written as \(I\).

\( I_{ij} = 1 \mbox{ if } i=j, 0 \mbox{ otherwise} \)

\( IA = AI = A \)

The transpose is the matrix with the rows and columns of \(A\) interchanged.

It is written as \( A^T \).

\( A^T_{ij} = A_{ji} \)
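In Python, swapping rows and columns is a one-liner with `zip` (the name `transpose` is illustrative):

```python
def transpose(A):
    """Rows become columns: (A^T)_ij = A_ji."""
    return [list(col) for col in zip(*A)]

A = [[1, 2, 3], [4, 5, 6]]       # a 2x3 matrix
print(transpose(A))               # [[1, 4], [2, 5], [3, 6]] -- now 3x2
```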

The inverse is the matrix that multiplies with \(A\) to give the identity matrix.

It is written as \(A^{-1}\).

\( AA^{-1} = A^{-1}A = I \)

We'll examine elementary row operations.

There are only three!

- Switch two rows: \( r_i, r_j \rightarrow r_j, r_i\)
- Multiply a row by a constant: \( r_i \rightarrow Cr_i\)
- Add two rows together: \( r_i \rightarrow r_i + r_j\)

These can be combined into one step:

\( r_i \rightarrow C_1 r_i + C_2 r_j\)

If \(A\) is invertible, it can be reduced to \(I\)

using elementary row operations.

\(A^{-1}\) is the result of

the same operations applied to \(I\).

The method to do this

on the augmented matrix \( [A \mid I] \)

is called Gaussian elimination.

In reducing \(A\) to \(I\), we basically multiply by \(A^{-1}\) (since \(A^{-1}A=I\)).

Furthermore, since \(IA^{-1} = A^{-1}\),

we can do the same operations on \(I\)

to get \(A^{-1}\) explicitly.
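A minimal sketch of this procedure in Python: Gauss-Jordan elimination on the augmented matrix, using only the three elementary row operations. The pivot search and the `1e-12` tolerance are my own choices, not from the text:

```python
def invert(A):
    """Invert a square matrix via Gauss-Jordan elimination on [A | I].
    Raises ValueError if A is singular."""
    n = len(A)
    # Build the augmented matrix [A | I].
    M = [list(map(float, A[i])) + [1.0 if j == i else 0.0 for j in range(n)]
         for i in range(n)]
    for col in range(n):
        # Row switch: bring a row with a nonzero pivot into position.
        pivot = next((r for r in range(col, n) if abs(M[r][col]) > 1e-12), None)
        if pivot is None:
            raise ValueError("matrix is singular")
        M[col], M[pivot] = M[pivot], M[col]
        # Scale the pivot row so the pivot entry becomes 1.
        p = M[col][col]
        M[col] = [x / p for x in M[col]]
        # Row addition: zero out the rest of this column.
        for r in range(n):
            if r != col:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    # The right half of [I | A^{-1}] is the inverse.
    return [row[n:] for row in M]

print(invert([[2.0, 1.0], [1.0, 1.0]]))  # [[1.0, -1.0], [-1.0, 2.0]]
```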

Consider vectors \( v \in \mathbb{R}^n \). They are linearly independent if

the only linear combination equal to zero

has all coefficients equal to zero.

\( \{v_i\}_{i=1}^n \) linearly independent

\( \Longleftrightarrow \left( \sum_{i=1}^n c_i v_i = 0 \Leftrightarrow c_i = 0 \; \forall i \right) \)

Alternately, \(\{v_i\}_{i=1}^n\) is linearly independent if

no vector \(v_j\) can be created from

a linear combination of the other vectors.
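One way to test independence numerically is to row-reduce the vectors and count pivots: the vectors are independent exactly when every row yields a pivot. A sketch (the rank-counting approach and tolerance are my own choices):

```python
def are_independent(vectors):
    """Row-reduce the vectors (as rows) and count pivots.
    Independent iff the pivot count equals the number of vectors."""
    M = [list(map(float, v)) for v in vectors]
    rank, col, rows, cols = 0, 0, len(M), len(M[0])
    while rank < rows and col < cols:
        # Find a row at or below `rank` with a nonzero entry in this column.
        pivot = next((r for r in range(rank, rows) if abs(M[r][col]) > 1e-12), None)
        if pivot is not None:
            M[rank], M[pivot] = M[pivot], M[rank]
            # Eliminate this column from the rows below.
            for r in range(rank + 1, rows):
                f = M[r][col] / M[rank][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[rank])]
            rank += 1
        col += 1
    return rank == rows

print(are_independent([[1, 0], [0, 1]]))  # True
print(are_independent([[1, 2], [2, 4]]))  # False: second = 2 * first
```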

\(A\) is invertible if its columns are linearly independent.

\(A\) is also invertible if

its rows are linearly independent.

The determinant is a scalar associated with a square matrix.

It is written as \( \det(A) \).

\(A\) is invertible if and only if \( \det(A) \neq 0 \).

\( A = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \Rightarrow \det(A) = ad-bc \)

\( A = \begin{pmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{pmatrix} \Rightarrow \det(A) = a_1(b_2c_3-b_3c_2) - a_2(b_1c_3-b_3c_1) + a_3(b_1c_2-b_2c_1) \)
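A direct transcription of these two formulas in Python (function names are illustrative):

```python
def det2(a, b, c, d):
    """Determinant of the 2x2 matrix [[a, b], [c, d]]."""
    return a * d - b * c

def det3(M):
    """Determinant of a 3x3 matrix with rows (a1,b1,c1), (a2,b2,c2),
    (a3,b3,c3), expanded along the first column."""
    (a1, b1, c1), (a2, b2, c2), (a3, b3, c3) = M
    return (a1 * (b2 * c3 - b3 * c2)
            - a2 * (b1 * c3 - b3 * c1)
            + a3 * (b1 * c2 - b2 * c1))

print(det2(1, 2, 3, 4))                         # -2, so [[1,2],[3,4]] is invertible
print(det3([[1, 2, 3], [4, 5, 6], [7, 8, 9]]))  # 0, so this matrix is singular
```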

Eigenvectors

For a square matrix \(A\), we often examine special vectors \(v\)

that do not change direction when \(A\) is applied.

That is, for a certain scalar \(\lambda\),

\(Av = \lambda v\).

In this case, we call \(\lambda\) and \(v\)

an eigenvalue and eigenvector of \(A\).

\( Av = \lambda v \)

\( \Rightarrow Av - \lambda Iv = 0 \)

\( \Rightarrow (A - \lambda I)v = 0. \)

Now we can make an assertion: for \(v \neq 0\), the matrix \(A-\lambda I\) cannot be invertible.

Thus, \(\det{(A-\lambda I)} = 0\).

Why? Suppose \(A-\lambda I\) were invertible. Then

\( (A-\lambda I)v = 0 \)

\( \Rightarrow (A-\lambda I)^{-1}(A-\lambda I)v = 0 \)

\( \Rightarrow Iv = 0\)

\( \Rightarrow v=0. \)

This contradicts our original assumption that \(v \neq 0\).

Note that if \(v = 0\), then \( A\times 0 = 0 = \lambda \times 0 \Rightarrow \lambda\) can be anything.

So, let's focus on \(v \neq 0\), so that \(\det{(A-\lambda I)} = 0\).

The left-hand side is an \(n^{th}\)-order polynomial in \(\lambda\), called the characteristic polynomial of \(A\).

Thus, by the Fundamental Theorem of Algebra,

every \(n \times n\) matrix has \(n\) eigenvalues.

However, the eigenvalues are not necessarily distinct, and they may be complex.

To get the eigenvectors \( \{v_i\}_{i=1}^n \) , solve the original equations:

\( (A - \lambda_i I)v_i = 0, \; i = 1,\ldots,n \).

This produces \(n\) simultaneous linear equations.

Then, write each \(v_i\) in terms of one common component.

Finally, factor the common component out.

For example, if at first we obtain \( (v_1, 3v_1, 2v_1) \),

then the actual eigenvector is simply \( (1, 3, 2) \).
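For a \(2 \times 2\) matrix this recipe can be carried out directly: the characteristic polynomial is \(\lambda^2 - (a+d)\lambda + (ad-bc)\), solvable with the quadratic formula. A sketch assuming real eigenvalues (the example matrix is my own):

```python
import math

def eigen_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] from the characteristic polynomial
    lambda^2 - (a+d) lambda + (ad - bc) = 0.  Real-eigenvalue case only."""
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)   # assumes a nonnegative discriminant
    return (tr + disc) / 2, (tr - disc) / 2

# Example: A = [[2, 1], [1, 2]].
lam1, lam2 = eigen_2x2(2, 1, 1, 2)
print(lam1, lam2)  # 3.0 1.0

# Eigenvector for lam1 = 3: (A - 3I) has rows (-1, 1), so v = (1, 1).
v = (1, 1)
Av = (2 * v[0] + 1 * v[1], 1 * v[0] + 2 * v[1])
print(Av)  # (3, 3), i.e. 3 * v, confirming Av = lambda * v
```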

Course 18.06SC: MIT's famous linear algebra course

Linear Algebra Done Right: a canonical textbook

Email me.

Tweet @onkursen.