Math Help - Multiple regression coefficients

  1. #1
    Newbie
    Joined
    Dec 2009
    Posts
    9

    Multiple regression coefficients

    Hi there, I'm currently working on a research project that requires the use of multiple regression.

    I am really struggling to find any proofs or derivations for the coefficients in the linear regression line. I know that for simple regression we use least squares, which yields two simple formulas for B0 and B1. But in the multiple case we may have B0, B1, B2, ...

    How do you find these?

    Many online sources just seem to describe it as ugly and skip this part, or resort to a computer program.

    Many Thanks

    Sam

  2. #2
    Senior Member
    Joined
    Oct 2009
    Posts
    340
    Quote Originally Posted by saambre
    Hi there, I'm currently working on a research project that requires the use of multiple regression.

    I am really struggling to find any proofs or derivations for the coefficients in the linear regression line. I know that for simple regression we use least squares, which yields two simple formulas for B0 and B1. But in the multiple case we may have B0, B1, B2, ...

    How do you find these?

    Many online sources just seem to describe it as ugly and skip this part, or resort to a computer program.

    Many Thanks

    Sam
    For ordinary least squares regression, it isn't particularly ugly; you just need some background in linear algebra. The linear model is
 Y = X\beta + \epsilon
    where Y is an (n x 1) random vector of responses, X is an (n x p) design matrix of known predictor values, and \beta is a (p x 1) vector of unknown parameters; \epsilon is an (n x 1) vector of random errors (mean 0, typically with iid normal components).
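
    Written out row by row, this is just the familiar
 Y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_{p-1} x_{i,p-1} + \epsilon_i
    so the first column of X is a column of ones (carrying the intercept \beta_0) and the remaining columns hold the observed values of each predictor. This is exactly where your B0, B1, B2, ... sit.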

    A reasonable goal under these circumstances is to minimize
     Q(\beta) = \|Y - X\beta\|^2
    where \|\cdot\| denotes the Euclidean length (norm) of a vector. It can be shown that the minimizing value of \beta is
 \hat{\beta} = (X'X)^{-1} X'Y
    This is the least squares estimate of \beta. The proof is not very difficult if you have the appropriate machinery from matrix algebra (projections and so forth); you can also show it with a little more effort via matrix calculus (less background required): set the gradient of Q(\beta) to zero and show that the Hessian is positive definite.
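
    For completeness, here is a sketch of the matrix-calculus route, using only standard results. Expanding the objective,
 Q(\beta) = (Y - X\beta)'(Y - X\beta) = Y'Y - 2\beta'X'Y + \beta'X'X\beta
    and setting the gradient to zero gives the normal equations:
 \nabla Q(\beta) = -2X'Y + 2X'X\beta = 0 \quad\Rightarrow\quad X'X\beta = X'Y
    Provided X has full column rank, X'X is invertible and \hat{\beta} = (X'X)^{-1} X'Y follows; the Hessian \nabla^2 Q(\beta) = 2X'X is then positive definite, so this is indeed the minimum.

    If you want to sanity-check the formula numerically, here is a minimal sketch in Python using NumPy (the data below are made up purely for illustration):

    import numpy as np

    # Hypothetical example data: n = 5 observations, two predictors.
    x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
    y = np.array([3.1, 4.9, 9.2, 10.8, 14.1])

    # Design matrix: a column of ones for the intercept B0, then x1 and x2.
    X = np.column_stack([np.ones_like(x1), x1, x2])

    # Solve the normal equations (X'X) beta = X'Y. Solving the linear system
    # is numerically safer than forming the inverse of X'X explicitly.
    beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
    print(beta_hat)  # [B0, B1, B2]

    # Cross-check against NumPy's built-in least-squares routine.
    beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(beta_lstsq)  # should agree with beta_hat

    Both print statements should give the same three coefficients, which is the whole content of the \hat{\beta} = (X'X)^{-1} X'Y formula.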

