# Regression: Showing Xs in a simple linear model are linearly independent in L2

• Nov 21st 2012, 04:39 PM
chewitard
Regression: Showing Xs in a simple linear model are linearly independent in L2
Let $X \sim \exp(1)$, $Y = e^{-X}$, and consider the simple linear model $Y = \alpha + \beta X + \gamma X^2 + W$, where $E(W) = 0 = \rho(X, W) = \rho(X^2, W)$.
Demonstrate that $1, X, X^2$ are linearly independent in $L^2$.
It also gives a hint: $\exp(1) = G(1)$ (the gamma distribution with $p = 1$).

I'm not sure how to show linear independence in $L^2$; I'm not even quite sure what $L^2$ means exactly. Would showing $\operatorname{Cov}(1,X) = \operatorname{Cov}(X,X^2) = \operatorname{Cov}(1,X^2) = 0$ be enough for linear independence? I'm also not sure how to use the hint.
• Nov 21st 2012, 09:36 PM
chiro
Re: Regression: Showing Xs in a simple linear model are linearly independent in L2
Hey chewitard.

Do your notes or textbook say anything about $L^2$? Is it the Lebesgue space $L^2$?

You might want to see if this is the case and check the following:

Lp space - Wikipedia, the free encyclopedia
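
For what it's worth, here is one sketch of how the hint could be used, assuming $L^2$ here means the space of square-integrable random variables with inner product $\langle U, V \rangle = E[UV]$. In that setting, $1, X, X^2$ are linearly independent if and only if their Gram matrix of moments is nonsingular:

```latex
% Gram matrix G_{ij} = E[X^i X^j] = E[X^{i+j}] for i, j = 0, 1, 2.
% Since X ~ exp(1) = G(1) (the hint), E[X^k] = \Gamma(k+1) = k!.
G = \begin{pmatrix} E[1]   & E[X]   & E[X^2] \\
                    E[X]   & E[X^2] & E[X^3] \\
                    E[X^2] & E[X^3] & E[X^4] \end{pmatrix}
  = \begin{pmatrix} 1 & 1 & 2 \\ 1 & 2 & 6 \\ 2 & 6 & 24 \end{pmatrix},
\qquad
\det G = 1(48 - 36) - 1(24 - 12) + 2(6 - 4) = 4 \neq 0.
```

Since $\det G \neq 0$, no nontrivial combination $a + bX + cX^2$ has zero $L^2$ norm, which is exactly linear independence; the gamma hint supplies the moments $E[X^k] = k!$. (Zero covariances alone would not suffice, since covariance ignores the constant function $1$.)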