
Math Help - proof question about positive semidefinite matrix (important for regression analysis)

  1. #1
    phoenicks (Newbie, joined May 2009, 9 posts)

    proof question about positive semidefinite matrix (important for regression analysis)

    this is a proof I encountered in the appendix of William Greene's Econometric Analysis.

    If A is an n*k matrix with full column rank and n > k, then A'A is positive definite and AA' is positive semi-definite.

    the proof given by the author is as follows:

    By assumption, Ax is not equal to zero. So x'A'Ax = (Ax)'(Ax) = y'y = \sum_i y_i^2 > 0.

    for the latter case, because A has more rows than columns, there is a nonzero x such that A'x = 0, thus we can only have y'y >= 0.
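
    (a quick numerical sanity check of the two claims; this is not from Greene, just a minimal numpy sketch with a random A assumed for illustration:)

    import numpy as np

    rng = np.random.default_rng(0)      # arbitrary seed, for reproducibility only
    n, k = 6, 3                         # n > k
    A = rng.standard_normal((n, k))     # a random A has full column rank with probability 1

    AtA = A.T @ A                       # k x k
    AAt = A @ A.T                       # n x n

    print(np.linalg.eigvalsh(AtA))      # all strictly positive -> A'A is positive definite
    print(np.linalg.eigvalsh(AAt))      # n - k (near-)zero eigenvalues, the rest positive
                                        # -> AA' is only positive semi-definite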

    What I don't understand is the bold and underlined part.


    P.S.: this question comes from pg. 835 of the William Greene Econometric Analysis 5th edition textbook.

    Thanks in Advance !!!

  2. #2
    MHF Contributor (joined May 2008, 2,295 posts)
    Quote Originally Posted by phoenicks

    By assumption, Ax is not equal to zero. So x'A'Ax = (Ax)'(Ax) = y'y = \sum_i y_i^2 > 0.
    that A has full column rank means that the columns of A are linearly independent. see that if v_1, \cdots , v_k are the columns of A and \bold{x}=[x_1 \ x_2 \cdots \ x_k]^T, then A \bold{x}=x_1v_1 + \cdots + x_kv_k.

    so if A \bold{x}=\bold{0}, then x_1v_1 + \cdots + x_k v_k = \bold{0}, and thus x_j = 0 for all j, because v_1, \cdots , v_k are linearly independent. hence the only solution of A \bold{x}=\bold{0} is \bold{x}=\bold{0}; in other words, if \bold{x} \neq \bold{0}, then A\bold{x} \neq \bold{0}, which is exactly the author's assumption.
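
    (to see this numerically: with a full-column-rank A the smallest eigenvalue of A'A is strictly positive, so x'A'Ax = ||Ax||^2 > 0 for every nonzero x. a minimal numpy sketch, with a random A assumed just for illustration:)

    import numpy as np

    rng = np.random.default_rng(1)
    n, k = 6, 3
    A = rng.standard_normal((n, k))          # full column rank with probability 1
    print(np.linalg.matrix_rank(A) == k)     # True: the columns are linearly independent

    # hence Ax = 0 only for x = 0, and on unit vectors the quadratic form is
    # bounded below by the smallest eigenvalue of A'A
    lam_min = np.linalg.eigvalsh(A.T @ A)[0]
    print(lam_min > 0)                       # True

    x = rng.standard_normal(k)               # an arbitrary (almost surely nonzero) x
    print(x @ (A.T @ A) @ x > 0)             # True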


    for the latter case, because A has more rows than columns, there is a nonzero x such that A'x = 0, thus we can only have y'y >= 0.

    i think by A' you mean A^T, the transpose of A. assuming that the entries of A come from a field F, the matrix A^T has n columns and these columns are in F^k. we know that \dim F^k = k.

    thus if n > k, then any n vectors in F^k are linearly dependent. now A^T has n columns, say w_1, \cdots , w_n, and n > k. so they are linearly dependent, i.e. there exists \bold{0} \neq \bold{x}=[x_1 \ x_2 \cdots \ x_n]^T

    such that x_1w_1 + \cdots + x_nw_n = \bold{0}. this can also be written as A^T \bold{x} = \bold{0}.
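
    (numerically, one such nonzero x can be read off the full SVD of A^T: the right singular vectors beyond the first k span its null space. a minimal numpy sketch under the same random-A assumption as above:)

    import numpy as np

    rng = np.random.default_rng(2)
    n, k = 6, 3
    A = rng.standard_normal((n, k))    # n > k, full column rank with probability 1

    # A^T is k x n: its n columns sit in a k-dimensional space, so its null space
    # is nontrivial; the last rows of Vh from the full SVD span that null space
    _, _, Vh = np.linalg.svd(A.T)      # Vh is n x n
    x = Vh[-1]                         # a unit vector with A^T x = 0
    print(np.allclose(A.T @ x, 0))     # True
    print(x @ (A @ A.T) @ x)           # ~0: AA' is positive semi-definite but not definite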

  3. #3
    Newbie (joined May 2009, 9 posts)
    thank you!!! a brilliant and yet understandable proof.
