
Linear Prediction Filter

  1. #1
    Newbie
    Joined
    May 2009
    Posts
    11

    Linear Prediction Filter

Can someone please help me with this?

  2. #2
    Junior Member
    Joined
    Nov 2008
    Posts
    58
    Thanks
    1
Let \hat{s} = h_1u(n-1) + h_2u(n-2) + h_3u(n-3), where h_i, i = 1, 2, 3, are the coefficients to be determined.

Then apply the orthogonality principle,
E[(s(n)-\hat{s})u(n-i)] = 0, i = 1, 2, 3,
and solve for the three coefficients using the three equations.

After computing the coefficients, compute
E[(s(n)-\hat{s})^2]
to get the MMSE.
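A standard simplification, not stated above but following directly from it: once the h_i satisfy the orthogonality conditions, the error is also orthogonal to \hat{s} itself (which is a linear combination of the u's), so the MMSE reduces to

E[(s(n)-\hat{s})^2] = E[s(n)^2] - h_1E[s(n)u(n-1)] - h_2E[s(n)u(n-2)] - h_3E[s(n)u(n-3)]

which avoids expanding the square directly.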

  3. #3
    Newbie
    Joined
    May 2009
    Posts
    11
What is the orthogonality principle?
I couldn't get the three equations you mentioned. Could you give me more details?

  4. #4
    Junior Member
    Joined
    Nov 2008
    Posts
    58
    Thanks
    1
    You didn't learn about it in class? Do you have a formula involving matrix inversion?

    In its simplest form, the Orthogonality Principle states that "the error vector of the optimum estimator is orthogonal to any other possible estimator."

I assumed that you have had some linear algebra and understand the concept of an inner product. If two vectors are orthogonal, then their inner product is zero, i.e. <\bar{x} , \bar{y}> = 0.

    Here, the inner product of two random variables X and Y is defined to be <X , Y> = E[XY].
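A concrete consequence of this definition (a standard fact, not part of the original post): if X and Y both have zero mean, then

<X , Y> = E[XY] = Cov(X, Y)

so for zero-mean random variables, "orthogonal" means exactly "uncorrelated".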

Now let the optimum estimator be \hat{s} = h_1u(n-1) + h_2u(n-2) + h_3u(n-3), where h_i, i = 1, 2, 3, are the coefficients to be determined. You can think of u(n-1), u(n-2), u(n-3) as three linearly independent vectors (like the Cartesian basis \hat{i}, \hat{j}, \hat{k}, although I should add that the u's are not orthogonal like the Cartesian basis).

    The error of the optimum estimator is thus s(n) - \hat{s}.

    Now apply the orthogonality principle, and we have
    E[(s(n)-\hat{s}) A] = 0 where A is any possible estimator.

We have three unknowns to solve for, so we need three linearly independent equations. They can be obtained by substituting u(n-1), u(n-2), u(n-3) for A (each of the u's is a possible estimator by itself). So the three equations are

    E[(s(n)-\hat{s})u(n-1)] = 0
    E[(s(n)-\hat{s})u(n-2)] = 0
    E[(s(n)-\hat{s})u(n-3)] = 0

Substitute in the expression for \hat{s}, expand, plug in the values, and solve for the three unknown coefficients, and you're done.
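To make the "expand out" step concrete: substituting \hat{s} into the three conditions gives the linear system Rh = r, where R_{ij} = E[u(n-i)u(n-j)] and r_i = E[s(n)u(n-i)]. Below is a minimal numerical sketch in Python; the matrix R, the vector r, and E[s(n)^2] are placeholders, since the actual statistics come from the original problem statement, which isn't quoted in this thread.

```python
import numpy as np

# Placeholder second-order statistics -- the true values come from the
# problem statement (not quoted in this thread), so these numbers are
# illustrative only.
# R[i, j] = E[u(n-1-i) u(n-1-j)]   (correlation matrix of the data)
# r[i]    = E[s(n) u(n-1-i)]       (cross-correlation with the signal)
R = np.array([[1.0,  0.5,  0.25],
              [0.5,  1.0,  0.5],
              [0.25, 0.5,  1.0]])
r = np.array([0.5, 0.25, 0.125])
s_var = 1.0  # placeholder for E[s(n)^2]

# The three orthogonality conditions E[(s(n)-s_hat)u(n-i)] = 0,
# i = 1, 2, 3, are exactly the linear system R h = r.
h = np.linalg.solve(R, r)
print("coefficients h1, h2, h3:", h)

# MMSE = E[s(n)^2] - h^T r, using the orthogonality of the error.
mmse = s_var - h @ r
print("MMSE:", mmse)
```

With the true correlation values in place of the placeholders, the coefficients and the MMSE follow immediately.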

  5. #5
    Newbie
    Joined
    May 2009
    Posts
    11
Thank you for your help.
I have to say that my teacher didn't go through the problems well: all theory and no examples for us to work from.
I am totally lost.
