Linear independence of eigenvectors;
a sufficient condition

Branko Ćurgus

Theorem. If $\vec{v_1}, \ldots, \vec{v_r}$ are eigenvectors that correspond to distinct eigenvalues $\lambda_1,\ldots,\lambda_r$ of an $n\times n$ matrix $A$, then the set $\bigl\{ \vec{v_1}, \ldots, \vec{v_r} \bigr\}$ is linearly independent.
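As a concrete illustration of the theorem (a small example added here for orientation; the matrix is chosen only for illustration), the matrix
\[ A = \begin{bmatrix} 1 & 1 \\ 0 & 2 \end{bmatrix} \]
has the distinct eigenvalues $\lambda_1 = 1$ and $\lambda_2 = 2$, with corresponding eigenvectors
\[ \vec{v_1} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \qquad \vec{v_2} = \begin{bmatrix} 1 \\ 1 \end{bmatrix}. \]
Neither of these vectors is a scalar multiple of the other, so the set $\bigl\{ \vec{v_1}, \vec{v_2} \bigr\}$ is linearly independent, as the theorem asserts.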
Proof. In this proof we consider the case $r = 3$. We assume the following:
  1. $\lambda_1 \neq \lambda_2$, $\lambda_1 \neq \lambda_3$, and $\lambda_2 \neq \lambda_3$.
  2. $\vec{v_1}, \vec{v_2}, \vec{v_3}$ are nonzero vectors.
  3. $A\vec{v_1} = \lambda_1 \vec{v_1}$, $A\vec{v_2} = \lambda_2 \vec{v_2}$, and $A\vec{v_3} = \lambda_3 \vec{v_3}$.
To prove that $\vec{v_1}, \vec{v_2}, \vec{v_3}$ are linearly independent we have to prove the following implication:
\[ \alpha_1 \vec{v_1} + \alpha_2 \vec{v_2} + \alpha_3 \vec{v_3} = \vec{0} \quad \Rightarrow \quad \alpha_1 = \alpha_2 = \alpha_3 = 0. \]
To prove this implication we assume
\begin{equation} \label{eq1}
\alpha_1 \vec{v_1} + \alpha_2 \vec{v_2} + \alpha_3 \vec{v_3} = \vec{0}.
\end{equation}
The first step is to apply $A$ to both sides of \eqref{eq1} to get
\begin{equation*}
A\bigl(\alpha_1 \vec{v_1} + \alpha_2 \vec{v_2} + \alpha_3 \vec{v_3}\bigr) = A\vec{0}.
\end{equation*}
Next, we remember that this is a linear algebra class, so we use the linearity of $A$, together with $A\vec{0} = \vec{0}$, to get
\begin{equation*}
\alpha_1 A\vec{v_1} + \alpha_2 A\vec{v_2} + \alpha_3 A\vec{v_3} = \vec{0}.
\end{equation*}
By assumption 3 the last equality becomes
\begin{equation} \label{eq2}
\alpha_1 \lambda_1 \vec{v_1} + \alpha_2 \lambda_2 \vec{v_2} + \alpha_3 \lambda_3 \vec{v_3} = \vec{0}.
\end{equation}
Next we multiply both sides of \eqref{eq1} by $\lambda_3$ to get
\begin{equation} \label{eq3}
\alpha_1 \lambda_3 \vec{v_1} + \alpha_2 \lambda_3 \vec{v_2} + \alpha_3 \lambda_3 \vec{v_3} = \vec{0}.
\end{equation}
Now we subtract \eqref{eq3} from \eqref{eq2} to get
\begin{equation} \label{eq4}
\alpha_1 (\lambda_1-\lambda_3) \vec{v_1} + \alpha_2 (\lambda_2-\lambda_3) \vec{v_2} = \vec{0}.
\end{equation}
The equality \eqref{eq4} is similar to \eqref{eq1}, just with fewer eigenvectors. So we repeat the steps from before: apply $A$ to \eqref{eq4} to get
\begin{equation} \label{eq5}
\alpha_1 (\lambda_1-\lambda_3) \lambda_1 \vec{v_1} + \alpha_2 (\lambda_2-\lambda_3) \lambda_2 \vec{v_2} = \vec{0},
\end{equation}
and multiply \eqref{eq4} by $\lambda_2$ to get
\begin{equation} \label{eq6}
\alpha_1 (\lambda_1-\lambda_3) \lambda_2 \vec{v_1} + \alpha_2 (\lambda_2-\lambda_3) \lambda_2 \vec{v_2} = \vec{0}.
\end{equation}
Subtracting \eqref{eq6} from \eqref{eq5} yields
\begin{equation} \label{eq7}
\alpha_1 (\lambda_1-\lambda_3) (\lambda_1-\lambda_2) \vec{v_1} = \vec{0}.
\end{equation}
Since $\lambda_1 - \lambda_2 \neq 0$, $\lambda_1 - \lambda_3 \neq 0$, and $\vec{v_1} \neq \vec{0}$, \eqref{eq7} implies
\begin{equation} \label{eq8}
\alpha_1 = 0.
\end{equation}
With \eqref{eq8}, equation \eqref{eq1} becomes
\begin{equation} \label{eq9}
\alpha_2 \vec{v_2} + \alpha_3 \vec{v_3} = \vec{0}.
\end{equation}
Starting from \eqref{eq9} and repeating the steps above (apply $A$ to \eqref{eq9}, multiply \eqref{eq9} by $\lambda_3$, and subtract to obtain $\alpha_2 (\lambda_2 - \lambda_3) \vec{v_2} = \vec{0}$, hence $\alpha_2 = 0$; then \eqref{eq9} reduces to $\alpha_3 \vec{v_3} = \vec{0}$, hence $\alpha_3 = 0$) we conclude
\[ \alpha_2 = 0, \qquad \alpha_3 = 0. \]
This completes the proof.
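The theorem can also be checked numerically. The following short sketch (an illustration added here, assuming NumPy is available; the matrix below is an arbitrary example with distinct eigenvalues) computes the eigenvectors of such a matrix and verifies that the matrix whose columns are these eigenvectors has full rank, that is, that the eigenvectors are linearly independent.
\begin{verbatim}
# Numerical illustration of the theorem (example matrix chosen arbitrarily).
import numpy as np

# A 3x3 matrix with three distinct eigenvalues (2, 3, and 5, since it is triangular).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

# numpy.linalg.eig returns the eigenvalues and a matrix whose columns are eigenvectors.
eigenvalues, V = np.linalg.eig(A)
print("eigenvalues:", eigenvalues)

# If the eigenvectors are linearly independent, V has full rank (here, rank 3).
print("rank of the eigenvector matrix:", np.linalg.matrix_rank(V))
\end{verbatim}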