Show that if we have an orthogonal set of vectors φ1, . . . , φk, then φ1, . . . , φk are linearly independent as well, i.e. the only solution of c1φ1 + · · · + ckφk = 0 is c1 = · · · = ck = 0.

1 Answer

  • Let [tex]\{\varphi_i~|~i\in\mathbb N,1\le i\le k\}[/tex] be an orthogonal set of vectors. By definition of orthogonality, the dot product of any two distinct vectors in the set is zero:

    [tex]\varphi_i\cdot\varphi_j=\begin{cases}\|\varphi_i\|^2&\text{if }i=j\\0&\text{if }i\neq j\end{cases}[/tex]

    Suppose, for contradiction, that the [tex]\varphi_i[/tex] are linearly dependent, i.e. that some nontrivial linear combination of them equals the zero vector: there exist [tex]c_i\in\mathbb R[/tex], not all zero, such that

    [tex]\displaystyle\sum_{i=1}^kc_i\varphi_i=c_1\varphi_1+\cdots+c_k\varphi_k=\mathbf 0[/tex]

    (This is the hypothesis we will contradict.)

    Take the dot product of both sides with an arbitrary [tex]\varphi_j[/tex] from the set:

    [tex]\varphi_j\cdot\displaystyle\sum_{i=1}^kc_i\varphi_i=c_1\varphi_j\cdot\varphi_1+\cdots+c_k\varphi_j\cdot\varphi_k=\varphi_j\cdot\mathbf 0=0[/tex]

    By orthogonality, every term with [tex]i\neq j[/tex] vanishes, so this reduces to

    [tex]c_j\,\varphi_j\cdot\varphi_j=c_j\|\varphi_j\|^2=0[/tex]
    Since an orthogonal set is (by the usual convention) assumed to contain no zero vectors, [tex]\|\varphi_j\|^2>0[/tex], and therefore [tex]c_j=0[/tex]. Since [tex]j[/tex] was arbitrary, all the coefficients must be zero, contradicting the hypothesis that not all [tex]c_i[/tex] are zero. Hence the set of vectors is linearly independent.
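As a quick numerical sanity check (a sketch, not a substitute for the proof), one can verify with NumPy that a small example orthogonal set is linearly independent by confirming its Gram matrix is diagonal and that the matrix of vectors has full rank. The particular vectors below are my own example, not from the question:

```python
import numpy as np

# An example orthogonal set in R^3 (rows): pairwise dot products are zero.
phi = np.array([
    [1.0,  1.0, 0.0],
    [1.0, -1.0, 0.0],
    [0.0,  0.0, 2.0],
])

# Orthogonality check: the Gram matrix phi @ phi.T should be diagonal,
# with the squared norms ||phi_i||^2 on the diagonal.
gram = phi @ phi.T
assert np.allclose(gram, np.diag(np.diag(gram)))

# Linear independence of k vectors <=> the k-row matrix has rank k.
rank = np.linalg.matrix_rank(phi)
print(rank)  # 3, matching the number of vectors
```

Full rank means the only solution of [tex]c_1\varphi_1+c_2\varphi_2+c_3\varphi_3=\mathbf 0[/tex] is the trivial one, exactly as the proof concludes.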
