# Math Help - QR-factorization

1. ## QR-factorization

Hi

If $A = QR$, where $Q$ is $m \times n$ and $R$ is $n \times n$, show that if the columns of $A$ are linearly independent, then $R$ must be invertible.
(Hint: Study the equation $R\vec{x} = \vec{0}$, and use the fact that $A = QR$.)

Need some guidance here.
I tried something like:

$R\vec{x} = \vec{0} \Rightarrow (Q^{T}A)\vec{x}=\vec{0}$, since $Q^{T}Q = I$ and $A = QR$ together give $Q^{T}A = Q^{T}QR = R$.

But $Q = \left[ \begin{matrix} \vec{u}_{1} & \cdots & \vec{u}_{n} \end{matrix} \right]$, and the columns of $Q$ are orthonormal.
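Spelling out why orthonormal columns give $Q^{T}Q = I$ (the fact used in the step above):

$$Q^{T}Q = \begin{bmatrix} \vec{u}_{1}^{T} \\ \vdots \\ \vec{u}_{n}^{T} \end{bmatrix} \begin{bmatrix} \vec{u}_{1} & \cdots & \vec{u}_{n} \end{bmatrix} = \left[ \vec{u}_{i}^{T}\vec{u}_{j} \right]_{i,j} = I_{n},$$

since orthonormality means $\vec{u}_{i}^{T}\vec{u}_{j} = 1$ when $i = j$ and $0$ otherwise.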

So each column of $(Q^{T}A)$ is a linear combination of the columns of $Q^{T}$, using the entries of the corresponding column of $A$ as weights.
Is it correct to say that the columns of $Q^{T}$ will also be linearly independent?

If so, $R = Q^{T}A$ will have linearly independent columns, and since $R$ is $n \times n$, $R$ is invertible by the Invertible Matrix Theorem.
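Not a proof, but the claim can be sanity-checked numerically. A minimal sketch using NumPy's `np.linalg.qr` (its default "reduced" mode returns $Q$ as $m \times n$ and $R$ as $n \times n$, matching the setup here); the matrix $A$ below is just an illustrative example:

```python
import numpy as np

# A 3x2 matrix whose columns are clearly linearly independent.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])

# Reduced QR factorization: Q is 3x2 with orthonormal columns, R is 2x2.
Q, R = np.linalg.qr(A)

print(np.allclose(A, Q @ R))            # the factorization reproduces A
print(np.allclose(Q.T @ Q, np.eye(2)))  # Q^T Q = I (orthonormal columns)
print(np.linalg.matrix_rank(R))         # 2, i.e. full rank, so R is invertible
```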

Thanks!

2. No one?