
Regression: Showing Xs in a simple linear model are linearly independent in L2

  1. #1 Newbie (Joined: Oct 2009, Posts: 14)

    Regression: Showing Xs in a simple linear model are linearly independent in L2

    Let X \sim \text{Exp}(1), Y = e^{-X}, and consider the simple linear model Y = \alpha + \beta X + \gamma X^2 + W, where E(W) = 0 = \rho(X, W) = \rho(X^2, W).
    Demonstrate that 1, X, X^2 are linearly independent in L^2.
    It also gives a hint: \text{Exp}(1) = \Gamma(1) (the gamma distribution with p = 1).

    I'm not sure how to show linear independence in L^2; I'm not even quite sure what L^2 means exactly. Would showing Cov(1, X) = Cov(X, X^2) = Cov(1, X^2) = 0 be enough for linear independence? I'm also not sure how to use the hint.
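    My best guess at what I actually need to show (assuming L^2 means the random variables Z with E(Z^2) < \infty, which I'm not certain of) is that the only way to get

    a \cdot 1 + b X + c X^2 = 0 \ \text{almost surely}

    is with a = b = c = 0.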

  2. #2 MHF Contributor (Joined: Sep 2012, From: Australia, Posts: 3,619, Thanks: 592)

    Re: Regression: Showing Xs in a simple linear model are linearly independent in L2

    Hey chewitard.

    Do your notes or textbooks say anything about L^2? Is it the Lebesgue space L^2?

    You might want to see if this is the case and check the following:

    Lp space - Wikipedia, the free encyclopedia
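
    For what it's worth, here is a minimal sketch of one standard way to check this, assuming L^2 really is the space of square-integrable random variables with inner product \langle U, V \rangle = E(UV) (that assumption is mine, not something stated in your notes). Then 1, X, X^2 are linearly independent in L^2 exactly when their Gram matrix G = \left( E(X^{i+j}) \right)_{0 \le i, j \le 2} is nonsingular, since E\left( (a + bX + cX^2)^2 \right) = (a, b, c) \, G \, (a, b, c)^T. The hint supplies the moments: for X \sim \Gamma(1) we have E(X^k) = \Gamma(k+1) = k!, so

    G = \begin{pmatrix} 1 & 1 & 2 \\ 1 & 2 & 6 \\ 2 & 6 & 24 \end{pmatrix}, \qquad \det G = 4 \neq 0,

    and the only coefficients with a + bX + cX^2 = 0 almost surely are a = b = c = 0.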
