1. ## QR-factorization

Hi

If $\displaystyle A = QR$, where $Q$ is $m \times n$ and $R$ is $n \times n$, show that if the columns of $A$ are linearly independent, then $R$ must be invertible.
(Hint: Study the equation $\displaystyle R\vec{x} = \vec{0}$, and use the fact that $\displaystyle A = QR$.)

Need some guidance here.
I tried something like:

$\displaystyle R\vec{x} = \vec{0} \Rightarrow (Q^{T}A)\vec{x}=\vec{0}$

But $\displaystyle Q = \left[ \begin{matrix} \vec{u}_{1} & \cdots & \vec{u}_{n} \end{matrix} \right]$
And the columns of Q are orthonormal.
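For completeness (this step isn't spelled out above), orthonormal columns mean the entries of $Q^{T}Q$ are dot products of the $\vec{u}_{i}$:

$\displaystyle (Q^{T}Q)_{ij} = \vec{u}_{i}\cdot\vec{u}_{j} = \begin{cases} 1 & i = j \\ 0 & i \neq j \end{cases}$

so $\displaystyle Q^{T}Q = I_{n}$, and therefore $\displaystyle Q^{T}A = Q^{T}QR = R$.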

So the columns of the matrix $\displaystyle Q^{T}A$ are linear combinations of the columns of $\displaystyle Q^{T}$, using the entries of the columns of A as weights.
But the columns of $\displaystyle Q^{T}$ are the rows of Q, and there are $m$ of them in $\mathbb{R}^{n}$, so I don't think they can be linearly independent when $m > n$.

What does seem to work: since the columns of Q are orthonormal, $\displaystyle Q^{T}Q = I_{n}$, so $\displaystyle R = Q^{T}A$. Then $\displaystyle R\vec{x} = \vec{0}$ implies $\displaystyle A\vec{x} = QR\vec{x} = \vec{0}$, and since the columns of A are linearly independent, $\vec{x} = \vec{0}$ is the only solution. So R is an n x n matrix whose equation $R\vec{x} = \vec{0}$ has only the trivial solution, and R is invertible by the Invertible Matrix Theorem.

Does this reasoning hold up?
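Not part of the proof, but here is a quick numerical sanity check using `numpy.linalg.qr` (its default reduced mode returns Q as m x n with orthonormal columns and R as n x n upper triangular); the matrix A below is just an assumed example with linearly independent columns:

```python
import numpy as np

# A tall (3 x 2) matrix with linearly independent columns.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])

# Reduced QR: Q is 3 x 2 with orthonormal columns, R is 2 x 2 upper triangular.
Q, R = np.linalg.qr(A)

# Orthonormal columns: Q^T Q = I_n, hence Q^T A = R.
print(np.allclose(Q.T @ Q, np.eye(2)))  # prints True
print(np.allclose(Q.T @ A, R))          # prints True

# R is invertible: nonzero determinant (up to floating-point tolerance).
print(abs(np.linalg.det(R)) > 1e-12)    # prints True
```

If you instead make the columns of A dependent (say, the second column a multiple of the first), the diagonal of R picks up a zero and the determinant check fails, matching the statement being proved.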

Thanks!

2. No one?