# Math Help - proof question about positive semidefinite matrix (important for regression analysis)

1. ## proof question about positive semidefinite matrix (important for regression analysis)

this is a proof I encountered in the appendix of William Greene's Econometric Analysis.

If $A$ is an $n \times k$ matrix with full column rank and $n > k,$ then $A'A$ is positive definite and $AA'$ is positive semidefinite.

The proof given by the author is as follows:

By assumption, $A\bold{x} \neq \bold{0}$ whenever $\bold{x} \neq \bold{0}.$ So, writing $\bold{y} = A\bold{x},$ we have $\bold{x}'A'A\bold{x} = (A\bold{x})'(A\bold{x}) = \bold{y}'\bold{y} = \sum_i y_i^2 > 0.$

For the latter case: because $A$ has more rows than columns, there is an $\bold{x} \neq \bold{0}$ such that $A'\bold{x} = \bold{0},$ thus we can only have $\bold{y}'\bold{y} \geq 0.$

What I don't understand is the bold and underlined part.

P.S.: this question comes from pg. 835 of William Greene's Econometric Analysis, 5th edition.

2. Originally Posted by phoenicks
this is a proof I encountered in the appendix of William Greene's Econometric Analysis.

If $A$ is an $n \times k$ matrix with full column rank and $n > k,$ then $A'A$ is positive definite and $AA'$ is positive semidefinite.

The proof given by the author is as follows:

By assumption, $A\bold{x} \neq \bold{0}$ whenever $\bold{x} \neq \bold{0}.$ So, writing $\bold{y} = A\bold{x},$ we have $\bold{x}'A'A\bold{x} = (A\bold{x})'(A\bold{x}) = \bold{y}'\bold{y} = \sum_i y_i^2 > 0.$
$A$ having full column rank means that the columns of $A$ are linearly independent. note that if $v_1, \cdots , v_k$ are the columns of $A$ and $\bold{x}=[x_1 \ x_2 \cdots \ x_k]^T,$ then $A \bold{x}=x_1v_1 + \cdots + x_kv_k.$

so if $A \bold{x}=\bold{0},$ then $x_1v_1 + \cdots + x_k v_k = 0,$ and thus $x_j = 0,$ for all $j,$ because $v_1, \cdots , v_k$ are linearly independent. hence the only solution of $A \bold{x}=\bold{0}$ is $\bold{x}=\bold{0}.$
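as a quick numerical sanity check (not part of Greene's argument, just an illustration with a random matrix; NumPy is assumed): full column rank makes every eigenvalue of $A'A$ strictly positive, so the quadratic form is positive for any nonzero $\bold{x}.$

```python
import numpy as np

# illustration only: a random 5x3 Gaussian matrix has full column rank
# with probability 1 (here n = 5 > k = 3)
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

AtA = A.T @ A                      # k x k
eigs = np.linalg.eigvalsh(AtA)     # all strictly positive for full column rank
print(np.all(eigs > 0))

# the quadratic form x'A'Ax = (Ax)'(Ax) is positive for any nonzero x
x = rng.standard_normal(3)
print(x @ AtA @ x > 0)
```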

For the latter case: because $A$ has more rows than columns, there is an $\bold{x} \neq \bold{0}$ such that $A'\bold{x} = \bold{0},$ thus we can only have $\bold{y}'\bold{y} \geq 0.$

What I don't understand is the bold and underlined part.

P.S.: this question comes from pg. 835 of William Greene's Econometric Analysis, 5th edition.

i think by $A'$ you mean $A^T,$ the transpose of $A.$ assuming that the entries of $A$ come from a field $F,$ the matrix $A^T$ has $n$ columns and these columns are in $F^k.$ we know that $\dim F^k = k.$
thus if $n > k,$ then any $n$ vectors in $F^k$ are linearly dependent. now $A^T$ has $n$ columns, say $w_1, \cdots , w_n,$ and $n > k.$ so they are linearly dependent, i.e. there exists $\bold{0} \neq \bold{x}=[x_1 \ x_2 \cdots \ x_n]^T$
such that $x_1w_1 + \cdots + x_nw_n = \bold{0}.$ this also can be written as: $A^T \bold{x} = \bold{0}.$
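to see this concretely (again just a numerical sketch, not part of the original argument; NumPy assumed): $AA^T$ is $n \times n$ but has rank at most $k < n,$ so it has zero eigenvalues, and a nonzero $\bold{x}$ with $A^T\bold{x} = \bold{0}$ can be read off from the SVD of $A$:

```python
import numpy as np

# illustration: A is 5x3, so AA' is 5x5 with rank at most 3
rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))

AAt = A @ A.T
eigs = np.linalg.eigvalsh(AAt)     # ascending order; the smallest n - k vanish
print(np.allclose(eigs[:2], 0))

# the left singular vectors beyond the first k span the null space of A^T
U, s, Vt = np.linalg.svd(A)
x = U[:, -1]                       # nonzero vector with A^T x = 0
print(np.allclose(A.T @ x, 0))
print(np.isclose(x @ AAt @ x, 0))  # quadratic form vanishes: only semidefinite
```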