# Example Problems

### Example 1

Show whether the set of all vectors $$\begin{bmatrix}x \ y \ 0\end{bmatrix}$$ with $$x, y \in \mathbb{R}$$ forms a subspace of $$\mathbb{R}^3$$.

Property 1:

Let $$\vec{u} = \begin{bmatrix}x\_1 \ y\_1 \ 0\end{bmatrix}$$ and $$\vec{v} = \begin{bmatrix}x\_2 \ y\_2 \ 0\end{bmatrix}$$ for any $$x\_1, x\_2, y\_1, y\_2 \in \mathbb{R}$$. We observe that $$\vec{u}$$ and $$\vec{v}$$ are in the subspace.

$$\vec{u} + \vec{v} = \begin{bmatrix}x\_1 \ y\_1 \ 0\end{bmatrix} + \begin{bmatrix}x\_2 \ y\_2 \ 0\end{bmatrix} = \begin{bmatrix}x\_1 + x\_2 \ y\_1 + y\_2 \ 0\end{bmatrix}$$

$$\begin{bmatrix}x\_1 + x\_2 \ y\_1 + y\_2 \ 0\end{bmatrix}$$ is clearly in the subspace, so property 1 is satisfied.

Property 2:

Let $$\vec{u} = \begin{bmatrix}x \ y \ 0\end{bmatrix}$$ and $$c \in \mathbb{R}$$.

$$c\vec{u} = c\begin{bmatrix}x \ y \ 0\end{bmatrix} = \begin{bmatrix}cx \ cy \ 0\end{bmatrix}$$

$$\begin{bmatrix}cx \ cy \ 0\end{bmatrix}$$ is also in the subspace, so property 2 is satisfied as well.

Therefore, $$\begin{bmatrix}x \ y \ 0\end{bmatrix}$$ forms a subspace of $$\mathbb{R}^3$$.
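A quick numerical sanity check of the two properties (a sketch using NumPy; the random sampling here is not part of the derivation above). Membership in the set just means "third component equals zero":

```python
import numpy as np

# Membership in W = { (x, y, 0) : x, y real } means the third component is zero.
def in_W(v, tol=1e-12):
    return abs(v[2]) < tol

rng = np.random.default_rng(0)
u = np.array([*rng.standard_normal(2), 0.0])  # a random vector in W
v = np.array([*rng.standard_normal(2), 0.0])  # another random vector in W
c = rng.standard_normal()                     # a random scalar

print(in_W(u + v))   # closure under addition: True
print(in_W(c * u))   # closure under scalar multiplication: True
```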

### Example 2

Show whether $$\begin{bmatrix}1 \ 0 \ -1\end{bmatrix}t + \begin{bmatrix}2 \ 1 \ -3\end{bmatrix}$$ forms a subspace of $$\mathbb{R}^3$$ for $$t \in \mathbb{R}$$.

Property 1:

Let $$\vec{u} = \begin{bmatrix}1 \ 0 \ -1\end{bmatrix} \cdot 1 + \begin{bmatrix}2 \ 1 \ -3\end{bmatrix} = \begin{bmatrix}3 \ 1 \ -4\end{bmatrix}$$ and $$\vec{v} = \begin{bmatrix}1 \ 0 \ -1\end{bmatrix} \cdot 2 + \begin{bmatrix}2 \ 1 \ -3\end{bmatrix} = \begin{bmatrix}4 \ 1 \ -5\end{bmatrix}$$.

$$\vec{u} + \vec{v} = \begin{bmatrix}3 \ 1 \ -4\end{bmatrix} + \begin{bmatrix}4 \ 1 \ -5\end{bmatrix} = \begin{bmatrix}7 \ 2 \ -9\end{bmatrix}$$

Let us try to find the corresponding $$t$$ value for $$\begin{bmatrix}7 \ 2 \ -9\end{bmatrix}$$.

$$\begin{bmatrix}1 \ 0 \ -1\end{bmatrix} \cdot t + \begin{bmatrix}2 \ 1 \ -3\end{bmatrix} = \begin{bmatrix}7 \ 2 \ -9\end{bmatrix}$$\
$$\begin{bmatrix}1 \ 0 \ -1\end{bmatrix} \cdot t = \begin{bmatrix}5 \ 1 \ -6\end{bmatrix}$$

The first component requires $$t = 5$$, but the second component requires $$0 \cdot t = 1$$, so there is no $$t \in \mathbb{R}$$ that fulfills the equation. Therefore, $$\vec{u} + \vec{v}$$ is **not** in the set, so $$\begin{bmatrix}1 \ 0 \ -1\end{bmatrix}t + \begin{bmatrix}2 \ 1 \ -3\end{bmatrix}$$ does not form a subspace.

Graphically, $$\begin{bmatrix}1 \ 0 \ -1\end{bmatrix}t + \begin{bmatrix}2 \ 1 \ -3\end{bmatrix}$$ is a line in $$\mathbb{R}^3$$ that does **not** pass through the origin. However, according to the scaling property, $$\vec{0}$$ should always be part of a subspace since we can multiply any $$\vec{u}$$ in this subspace with $$c = 0$$ to get the zero vector. Therefore, $$\begin{bmatrix}1 \ 0 \ -1\end{bmatrix}t + \begin{bmatrix}2 \ 1 \ -3\end{bmatrix}$$ is not a valid subspace.

> If $$W$$ is a subset of vectors but $$\vec{0} \not\in W$$, then $$W$$ is not a valid subspace.
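We can also confirm this computation numerically (a sketch using NumPy; the least-squares solve is just one way to test whether a single scalar $$t$$ exists):

```python
import numpy as np

d = np.array([1.0, 0.0, -1.0])   # direction vector of the line
p = np.array([2.0, 1.0, -3.0])   # offset vector

u = d * 1 + p                    # the point with t = 1
v = d * 2 + p                    # the point with t = 2
w = u + v                        # [7, 2, -9]

# Solve d * t = w - p in the least-squares sense; a nonzero residual
# means no scalar t satisfies the equation exactly.
t, residual, *_ = np.linalg.lstsq(d.reshape(-1, 1), w - p, rcond=None)
print(residual[0] > 1e-9)        # True: u + v is not on the line
```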

### Example 3

Show that the null space of an $$m \times n$$ matrix $$\textbf{A}$$ forms a subspace of $$\mathbb{R}^n$$.

Let $$\vec{u}, \vec{v}\in \text{Null}(\textbf{A})$$. Then $$\textbf{A}\vec{u} = \vec{0}$$ and $$\textbf{A}\vec{v} = \vec{0}$$.

Property 1:

Let us check whether $$\vec{u} + \vec{v}$$ is in the null space of $$\textbf{A}$$.

$$\textbf{A}(\vec{u} + \vec{v}) = \textbf{A}\vec{u} + \textbf{A}\vec{v} = \vec{0} + \vec{0} = \vec{0}$$

Therefore, property 1 is satisfied.

Property 2:

Let us check whether $$c\vec{u}$$ is in the null space of $$\textbf{A}$$ for any $$c \in \mathbb{R}$$.

$$\textbf{A}(c\vec{u}) = c\textbf{A}\vec{u} = c\vec{0} = \vec{0}$$

Therefore, property 2 is satisfied as well, so the null space of a matrix forms a subspace.
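The same two properties can be spot-checked numerically (a sketch using NumPy; the rank-1 matrix below is an arbitrary example, and the null space basis comes from the SVD):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])      # rank 1, so Null(A) is 2-dimensional

# Null space basis from the SVD: the rows of Vt beyond the rank span Null(A).
rank = np.linalg.matrix_rank(A)
_, _, Vt = np.linalg.svd(A)
u, v = Vt[rank], Vt[rank + 1]
c = 3.7

print(np.allclose(A @ (u + v), 0))  # property 1: sums stay in Null(A)
print(np.allclose(A @ (c * u), 0))  # property 2: scalar multiples stay in Null(A)
```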

## Fundamental Subspaces

### What and why

For each matrix, there are four fundamental subspaces: the column space, the null space, the row space, and the left null space. In this section, we will explore the properties of each of these subspaces and see how they relate to one another.

### Column space

The **column space** of an $$m \times n$$ matrix $$\textbf{A}$$ refers to the range of $$\textbf{A}$$, or the span of its column vectors. Its dimension is called the **rank** of $$\textbf{A}$$.

$$\text{Col}(\textbf{A}) = \{\textbf{A}\vec{v} \in \mathbb{R}^m \mid \vec{v} \in \mathbb{R}^n\} = \text{span}\left\{\vec{a}\_1, \vec{a}\_2, \ldots, \vec{a}\_n\right\}$$

We can visualize the column space as the set of all vectors that are the "output" of the linear transformation given by $$\textbf{A}$$.

We note that the column space of $$\textbf{A}$$ is a subspace of $$\mathbb{R}^m$$. The dimension of the column space of $$\textbf{A}$$ is the number of pivots, or the number of linearly independent column vectors, in $$\textbf{A}$$.
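As a sketch (using NumPy; the matrix below is an illustrative choice, not from the notes), the dimension of the column space is exactly the matrix rank:

```python
import numpy as np

# Example matrix: the third column is 2*(column 1) + 3*(column 2),
# so only two columns are linearly independent.
A = np.array([[1., 0., 2.],
              [0., 1., 3.],
              [1., 1., 5.]])

print(np.linalg.matrix_rank(A))  # dim Col(A) = 2
```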

### Null space

The **null space** of an $$m \times n$$ matrix $$\textbf{A}$$ refers to all vectors that map to the zero vector $$\vec{0}$$ under the linear transformation given by $$\textbf{A}$$.

$$\text{Null}(\textbf{A}) = \{\vec{v} \in \mathbb{R}^n \mid \textbf{A}\vec{v} = \vec{0}\}$$

We note that the null space of $$\textbf{A}$$ is a subspace of $$\mathbb{R}^n$$. The dimension of the null space of $$\textbf{A}$$ is the number of free variables in the row echelon form of $$\textbf{A}$$.
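Numerically, the null space dimension can be read off from the singular values (a sketch using NumPy with an example matrix; counting near-zero singular values plays the role of counting free variables):

```python
import numpy as np

# Example: two pivot columns out of three, so one free variable.
A = np.array([[1., 0., 1.],
              [0., 1., 1.]])

s = np.linalg.svd(A, compute_uv=False)         # singular values of A
nullity = A.shape[1] - int((s > 1e-10).sum())  # n minus the rank
print(nullity)  # 1
```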

### Row space

The **row space** of an $$m \times n$$ matrix $$\textbf{A}$$ refers to the range of $$\textbf{A}^T$$, or the span of its row vectors.

$$\text{Row}(\textbf{A}) = \{\textbf{A}^T\vec{v} \in \mathbb{R}^n \mid \vec{v} \in \mathbb{R}^m\} = \text{span}\left\{\vec{r}\_1, \vec{r}\_2, \ldots, \vec{r}\_m\right\}$$

We note that the row space of $$\textbf{A}$$ is a subspace of $$\mathbb{R}^n$$. The dimension of the row space of $$\textbf{A}$$ is the number of pivots in $$\textbf{A}$$ since the number of pivots in $$\textbf{A}$$ equals the number of pivots in $$\textbf{A}^T$$ after row reduction.
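The fact that the pivot counts of $$\textbf{A}$$ and $$\textbf{A}^T$$ agree (row rank equals column rank) is easy to spot-check (a NumPy sketch with an arbitrary example matrix):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])

# Row rank equals column rank, so the ranks of A and A^T agree.
print(np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T))  # True
```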

### Left null space

The **left null space** of an $$m \times n$$ matrix $$\textbf{A}$$ refers to all vectors that map to the zero vector $$\vec{0}$$ under the linear transformation given by $$\textbf{A}^T$$.

$$\text{Null}(\textbf{A}^T) = \{\vec{v} \in \mathbb{R}^m \mid \textbf{A}^T\vec{v} = \vec{0}\}$$

We note that the left null space of $$\textbf{A}$$ is a subspace of $$\mathbb{R}^m$$. The dimension of the left null space of $$\textbf{A}$$ is the number of free variables in the row echelon form of $$\textbf{A}^T$$.

This subspace is called the *left* null space (while the "usual" null space is sometimes called the *right* null space) because $$\textbf{A}^T\vec{v} = \vec{0} \implies \vec{v}^T\textbf{A} = \vec{0}^T$$. In the latter equation, we are multiplying the vector $$\vec{v}^T$$ on the left of the matrix $$\textbf{A}$$.
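The equivalence of the two equations can be checked numerically (a NumPy sketch; the rank-1 matrix is an example choice, and the left null space vector comes from the SVD of $$\textbf{A}^T$$):

```python
import numpy as np

A = np.array([[1., 2.],
              [2., 4.],
              [3., 6.]])          # rank 1, so Null(A^T) is 2-dimensional

# A basis for Null(A^T) from the SVD of A^T: the rows of Vt beyond the rank.
rank = np.linalg.matrix_rank(A)
_, _, Vt = np.linalg.svd(A.T)
v = Vt[rank]                      # one left null space vector

print(np.allclose(A.T @ v, 0))    # A^T v = 0 ...
print(np.allclose(v @ A, 0))      # ... equivalently, v^T A = 0^T
```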

### Dimensions

Let $$r$$ denote the number of pivots in an $$m \times n$$ matrix $$\textbf{A}$$. Then we quickly observe the following:

$$\dim(\text{Col}(\textbf{A})) = r; \dim(\text{Null}(\textbf{A})) = n - r$$\
$$\dim(\text{Col}(\textbf{A}^T)) = r; \dim(\text{Null}(\textbf{A}^T)) = m - r$$

Note that $$\dim(\text{Col}(\textbf{A})) + \dim(\text{Null}(\textbf{A})) = n$$. This is called the **Rank Theorem**.

You should also note the following two equations:

1. Terms in $$\mathbb{R}^m$$: $$\dim(\text{Col}(\textbf{A})) + \dim(\text{Null}(\textbf{A}^T)) = m$$
2. Terms in $$\mathbb{R}^n$$: $$\dim(\text{Null}(\textbf{A})) + \dim(\text{Col}(\textbf{A}^T)) = n$$

Using orthogonal complements, we can show why these two equations are guaranteed to be true.
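All four dimension formulas can be verified at once (a sketch using NumPy; the $$3 \times 4$$ matrix below is an example with a deliberately dependent row):

```python
import numpy as np

# Example: m = 3, n = 4, and row 3 = row 1 + row 2, so r = 2.
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 0.],
              [1., 3., 1., 1.]])
m, n = A.shape

def null_dim(M, tol=1e-10):
    """dim Null(M): number of columns minus the number of nonzero singular values."""
    s = np.linalg.svd(M, compute_uv=False)
    return M.shape[1] - int((s > tol).sum())

r = np.linalg.matrix_rank(A)
print(r + null_dim(A) == n)      # terms in R^n sum to n
print(r + null_dim(A.T) == m)    # terms in R^m sum to m
```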

### Orthogonal complements

Let us examine the null space of an $$m \times n$$ matrix $$\textbf{A}$$.

We know that for every vector $$\vec{v} \in \text{Null}(\textbf{A})$$,

$$\textbf{A}\vec{v} = \vec{0}$$

Using the row vectors of $$\textbf{A}$$, we get

$$\textbf{A}\vec{v} = \begin{bmatrix}\vec{r}\_1^T \ \vec{r}\_2^T \ \vdots \ \vec{r}\_m^T\end{bmatrix}\vec{v} = \vec{0}$$

Writing out each component, we get $$\vec{r}\_1^T\vec{v} = \vec{r}\_2^T\vec{v} = \cdots = \vec{r}\_m^T\vec{v} = 0$$. We observe that these are just inner products, so $$\langle \vec{r}\_1, \vec{v} \rangle = \langle \vec{r}\_2, \vec{v} \rangle = \cdots = \langle \vec{r}\_m, \vec{v} \rangle = 0$$. This means that $$\vec{v}$$ is **orthogonal** to all of the row vectors of $$\textbf{A}$$.

Given that $$\vec{v} \in \text{Null}(\textbf{A})$$, we note that the null space of $$\textbf{A}$$ is **orthogonal** to the row space of $$\textbf{A}$$.

Since these two subspaces are orthogonal to each other, their intersection contains only the zero vector. Combined with the equation $$\dim(\text{Null}(\textbf{A})) + \dim(\text{Col}(\textbf{A}^T)) = n$$, this tells us that $$\text{Null}(\textbf{A}) + \text{Col}(\textbf{A}^T) = \mathbb{R}^n$$.

Therefore, we call $$\text{Null}(\textbf{A})$$ and $$\text{Col}(\textbf{A}^T)$$ **orthogonal complements** because

1. they are orthogonal to each other and because
2. they span $$\mathbb{R}^n$$.
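Both conditions can be spot-checked numerically (a NumPy sketch; the $$2 \times 3$$ matrix is an arbitrary full-row-rank example):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [0., 1., 1.]])      # rank 2, so Null(A) is 1-dimensional

rank = np.linalg.matrix_rank(A)
_, _, Vt = np.linalg.svd(A)
null_basis = Vt[rank:]            # rows spanning Null(A)

# Each row of A has zero inner product with each null space basis vector.
print(np.allclose(A @ null_basis.T, 0))
# The dimensions add up to n = 3, so together the two subspaces fill R^3.
print(rank + null_basis.shape[0] == A.shape[1])
```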

Similarly, let us examine the left null space of $$\textbf{A}$$.

We know that for every vector $$\vec{v} \in \text{Null}(\textbf{A}^T)$$,

$$\textbf{A}^T\vec{v} = \vec{0} \implies \vec{v}^T\textbf{A} = \vec{0}^T$$

Using the column vectors of $$\textbf{A}$$, we get

$$\vec{v}^T\textbf{A} = \vec{v}^T\begin{bmatrix}\vec{a}\_1 & \vec{a}\_2 & \cdots & \vec{a}\_n\end{bmatrix} = \vec{0}^T$$

Writing out each equation, we get, just like above, $$\langle \vec{v}, \vec{a}\_1 \rangle = \langle \vec{v}, \vec{a}\_2 \rangle = \cdots = \langle \vec{v}, \vec{a}\_n \rangle = 0$$.

Therefore, $$\vec{v}$$ is orthogonal to each column vector of $$\textbf{A}$$. Using similar reasoning as above, we conclude that the column space of $$\textbf{A}$$ and the left null space of $$\textbf{A}$$ are orthogonal complements because

1. they are orthogonal to each other and because
2. they span $$\mathbb{R}^m$$.
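The analogous check for the column space and the left null space (a NumPy sketch; the $$3 \times 2$$ matrix is an arbitrary full-column-rank example):

```python
import numpy as np

A = np.array([[1., 0.],
              [2., 1.],
              [3., 1.]])          # m = 3, n = 2, rank 2

rank = np.linalg.matrix_rank(A)
_, _, Vt = np.linalg.svd(A.T)
left_null_basis = Vt[rank:]       # basis of Null(A^T), 1-dimensional here

# Each left null space vector is orthogonal to every column of A ...
print(np.allclose(left_null_basis @ A, 0))
# ... and the dimensions add up to m, so together the two subspaces fill R^3.
print(rank + left_null_basis.shape[0] == A.shape[0])
```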
