The matrix-vector product $\mathbf{A} \mathbf{x}$ is fundamentally a linear combination of the columns of $\mathbf{A}$, where the components of $\mathbf{x}$ serve as the scalar weights. Consider a matrix $\mathbf{A} \in \mathbb{R}^{m \times n}$ and a vector $\mathbf{x} \in \mathbb{R}^n$:
$$\mathbf{A} \mathbf{x} = \begin{pmatrix} \mathbf{a}_1 & \mathbf{a}_2 & \cdots & \mathbf{a}_n \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix} = x_1 \mathbf{a}_1 + x_2 \mathbf{a}_2 + \cdots + x_n \mathbf{a}_n$$
For example, given:
$$\mathbf{A} \mathbf{x} = \begin{pmatrix} 2 & 0 \\ 1 & 1 \\ 1 & 3 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = x_1 \begin{pmatrix} 2 \\ 1 \\ 1 \end{pmatrix} + x_2 \begin{pmatrix} 0 \\ 1 \\ 3 \end{pmatrix} = x_1 \mathbf{a}_1 + x_2 \mathbf{a}_2$$
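The column-combination view is easy to verify numerically. The following NumPy sketch uses the example matrix above with an arbitrary, illustrative choice of weights $x_1 = 2$, $x_2 = -1$:

```python
import numpy as np

# Example matrix from the text; x is an arbitrary illustrative choice.
A = np.array([[2, 0],
              [1, 1],
              [1, 3]])
x = np.array([2, -1])

# Standard matrix-vector product.
product = A @ x

# The same vector assembled explicitly as x1*a1 + x2*a2.
combination = x[0] * A[:, 0] + x[1] * A[:, 1]

print(product)                               # [ 4  1 -1]
print(np.array_equal(product, combination))  # True
```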
A set of vectors $\{\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_n\}$ is considered linearly independent if no vector in the set can be defined as a linear combination of the others. Mathematically, this means the equation
$$x_1 \mathbf{a}_1 + x_2 \mathbf{a}_2 + \dots + x_n \mathbf{a}_n = \mathbf{0}$$
has only the trivial solution: $x_1 = x_2 = \dots = x_n = 0$.
If there exists a set of weights where at least one $x_i \neq 0$ that satisfies the equation, the vectors are linearly dependent. In such a case, at least one vector is "redundant" and does not contribute a new dimension to the span.
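One practical way to test independence (a sketch, not part of the original text) is to compare the matrix rank against the number of columns:

```python
import numpy as np

A = np.array([[2, 0],
              [1, 1],
              [1, 3]])
# The columns are independent iff the rank equals the column count.
independent = np.linalg.matrix_rank(A) == A.shape[1]
print(independent)  # True: neither column is a multiple of the other

# Appending a dependent column (a1 + a2) breaks independence.
B = np.column_stack([A, A[:, 0] + A[:, 1]])
print(np.linalg.matrix_rank(B) == B.shape[1])  # False
```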
The Span of a set of vectors $\{\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_n\}$ is the collection of all possible vectors that can be reached by forming linear combinations of that set. Mathematically, any vector $\mathbf{b}$ in the span can be expressed as
$$\mathbf{b} = x_1 \mathbf{a}_1 + x_2 \mathbf{a}_2 + \dots + x_n \mathbf{a}_n$$
where the $x_i$ are any real scalars.
Geometric Intuition
The span describes the "geometric footprint" created by the vectors:

- A single nonzero vector spans a line through the origin.
- Two linearly independent vectors span a plane through the origin.
- In general, $k$ linearly independent vectors span a $k$-dimensional subspace through the origin.
The Role of Redundancy
If a set of vectors is linearly dependent, adding a redundant vector does not expand the span. For example, if $\mathbf{a}_3$ is already a linear combination of $\mathbf{a}_1$ and $\mathbf{a}_2$, then the span of $\{\mathbf{a}_1, \mathbf{a}_2, \mathbf{a}_3\}$ is exactly the same as the span of $\{\mathbf{a}_1, \mathbf{a}_2\}$. In this case, $\mathbf{a}_3$ does not "contribute a new dimension."
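This can be confirmed numerically: the rank (the dimension of the span) is unchanged when a redundant vector is appended. A small sketch with $\mathbf{a}_3 = \mathbf{a}_1 + 2\mathbf{a}_2$ chosen for illustration:

```python
import numpy as np

a1 = np.array([2, 1, 1])
a2 = np.array([0, 1, 3])
a3 = a1 + 2 * a2  # redundant by construction

# The rank equals the dimension of the span; a3 adds nothing.
rank_two = np.linalg.matrix_rank(np.column_stack([a1, a2]))
rank_three = np.linalg.matrix_rank(np.column_stack([a1, a2, a3]))
print(rank_two, rank_three)  # 2 2
```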
The null space of a matrix $\mathbf{A}$ is the set of all vectors $\mathbf{x}$ that satisfy the homogeneous equation: $$\mathbf{A}\mathbf{x} = \mathbf{0}$$
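A null-space basis can be computed from the singular value decomposition: the right-singular vectors whose singular values are (numerically) zero span $N(\mathbf{A})$. The helper below is an illustrative sketch, not a library routine:

```python
import numpy as np

def null_space_basis(A, tol=1e-10):
    """Return a matrix whose columns form a basis for N(A)."""
    _, s, vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return vt[rank:].T  # rows of vt past the rank span the null space

A = np.array([[2, 0],
              [1, 1],
              [1, 3]])
print(null_space_basis(A).shape)  # (2, 0): only the zero vector in N(A)

# With a dependent third column (a1 + a2), N(B) is one-dimensional.
B = np.column_stack([A, A[:, 0] + A[:, 1]])
ns = null_space_basis(B)
print(ns.shape)                # (3, 1)
print(np.allclose(B @ ns, 0))  # True: the basis vector satisfies Bx = 0
```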
Connection between Independence and the Null Space
This is where the concepts of independence and the null space converge: the columns of $\mathbf{A}$ are linearly independent if and only if $N(\mathbf{A}) = \{\mathbf{0}\}$. Any nonzero vector $\mathbf{x} \in N(\mathbf{A})$ provides exactly the nontrivial weights $x_1, \dots, x_n$ that exhibit a dependence among the columns.
While the null space describes the "redundancy" of the input, the column space describes the "reach" of the output: it is the set of all possible linear combinations of the columns of $\mathbf{A}$.
The column space (or range) of a matrix $\mathbf{A}$ is denoted $C(\mathbf{A})$; it is always a subspace of $\mathbb{R}^m$, and its dimension equals the number of linearly independent columns. In the example above, the two columns are linearly independent, so $C(\mathbf{A})$ is the plane in $\mathbb{R}^3$ spanned by $\mathbf{a}_1$ and $\mathbf{a}_2$.
For the linear system $\mathbf{A}\mathbf{x} = \mathbf{b}$ to be consistent (possess at least one solution), the vector $\mathbf{b}$ must reside within the column space:
Existence Condition: $\mathbf{A}\mathbf{x} = \mathbf{b}$ has a solution if and only if $\mathbf{b} \in C(\mathbf{A})$.
If $\mathbf{b}$ lies outside this subspace, no combination of the columns of $\mathbf{A}$ can reach it, rendering the system inconsistent.
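The existence condition can be checked numerically: $\mathbf{b} \in C(\mathbf{A})$ exactly when appending $\mathbf{b}$ as an extra column does not increase the rank. The helper `in_column_space` below is a hypothetical name introduced for illustration:

```python
import numpy as np

def in_column_space(A, b):
    # b lies in C(A) iff [A | b] has the same rank as A alone.
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

A = np.array([[2, 0],
              [1, 1],
              [1, 3]])
b_reachable = 3 * A[:, 0] - A[:, 1]  # a combination of the columns by construction
b_unreachable = np.array([1, 0, 0])  # does not lie on the plane C(A)

print(in_column_space(A, b_reachable))    # True
print(in_column_space(A, b_unreachable))  # False
```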
If there are non-zero vectors in the null space, then any solution to $\mathbf{A}\mathbf{x} = \mathbf{b}$ is not unique, as any vector $\mathbf{x}_{null} \in N(\mathbf{A})$ can be added to a particular solution $\mathbf{x}_p$ without changing the result:
$$\mathbf{A}(\mathbf{x}_p + \mathbf{x}_{null}) = \mathbf{A}\mathbf{x}_p + \mathbf{A}\mathbf{x}_{null} = \mathbf{b} + \mathbf{0} = \mathbf{b}$$
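This non-uniqueness is easy to demonstrate with a matrix whose third column equals the sum of the first two; the particular solution and null-space vector below are illustrative choices:

```python
import numpy as np

B = np.array([[2, 0, 2],
              [1, 1, 2],
              [1, 3, 4]])             # third column = first + second
x_p = np.array([1.0, 1.0, 0.0])       # a particular solution
b = B @ x_p

x_null = np.array([1.0, 1.0, -1.0])   # lies in N(B)
print(np.allclose(B @ x_null, 0))     # True

# Shifting x_p by a null-space vector leaves b unchanged.
print(np.allclose(B @ (x_p + x_null), b))  # True
```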
Relation to Span
The Column Space $C(\mathbf{A})$ of a matrix is defined as the span of its column vectors. Therefore, when we ask if the equation $\mathbf{A}\mathbf{x} = \mathbf{b}$ has a solution, we are asking a geometric question: "Is the vector $\mathbf{b}$ located within the span of the columns of $\mathbf{A}$?"
The rank of a matrix, denoted $\text{rank}(\mathbf{A})$, is a single number that summarizes the "dimension" of the information contained within the matrix. It is formally defined in two equivalent ways:

1. The column rank: the dimension of the column space $C(\mathbf{A})$, i.e., the maximal number of linearly independent columns.
2. The row rank: the dimension of the row space, i.e., the maximal number of linearly independent rows.

A fundamental theorem of linear algebra states that these two numbers always coincide.
In our earlier example, because $\mathbf{a}_1$ and $\mathbf{a}_2$ are not multiples of one another, they are linearly independent. Therefore, $\text{rank}(\mathbf{A}) = 2$.
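As a quick numerical check (a NumPy sketch, not part of the original text), the column rank and the row rank of the example matrix coincide, since the rank of $\mathbf{A}^T$ is the row rank of $\mathbf{A}$:

```python
import numpy as np

A = np.array([[2, 0],
              [1, 1],
              [1, 3]])
# Column rank of A equals its row rank (the rank of A transposed).
print(np.linalg.matrix_rank(A))    # 2
print(np.linalg.matrix_rank(A.T))  # 2
```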