
Math Help - Gaussian vector and variance

  1. #1
    Junior Member
    Joined
    Nov 2009
    Posts
    57

    Gaussian vector and variance

    Let (\epsilon_1,...,\epsilon_{2n-1}) be a random vector with the density:
p(x_1,...,x_{2n-1})=c_n\exp(-\frac{1}{2}(x_1^2+\sum_{i=1}^{2n-2}(x_{i+1}-x_i)^2+x_{2n-1}^2))

One can check that this is a Gaussian vector with zero mean vector, and that the (2n-1)\times(2n-1) covariance matrix has an inverse given by the tridiagonal form:

\left(\begin{array}{ccccc} 2 & -1 & 0 & \cdots & 0 \\ -1 & 2 & -1 & \ddots & \vdots \\ 0 & -1 & \ddots & \ddots & 0 \\ \vdots & \ddots & \ddots & 2 & -1 \\ 0 & \cdots & 0 & -1 & 2 \end{array}\right)=M_{2n-1}

By induction, one can also check that \det(M_n)=n+1 for all n\geq 1, which enables one to obtain the normalizing constant c_n=\frac{\sqrt{2n}}{(2\pi)^{\frac{2n-1}{2}}} using the usual formula for the Gaussian density (see for example Grimmett & Stirzaker).
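A quick numerical sanity check of the determinant formula (a Python/numpy sketch; the helper name M is just for illustration):

```python
import numpy as np

def M(n):
    """The n x n tridiagonal matrix with 2 on the diagonal
    and -1 on the sub- and super-diagonals."""
    return 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

# det(M_n) should equal n + 1 for every n >= 1
for n in range(1, 12):
    assert abs(np.linalg.det(M(n)) - (n + 1)) < 1e-8
```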

Now for the question: how does one prove that there is a constant a, not depending on n, such that \mathrm{Var}(\epsilon_n)\geq a\,n for all n\geq 1?

Thanks in advance for your help.
    Last edited by akbar; December 19th 2009 at 03:04 PM. Reason: typo

  2. #2
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by akbar View Post
    Let (\epsilon_1,...,\epsilon_{2n-1}) be a random vector with the density:
p(x_1,...,x_{2n-1})=c_n\exp(-\frac{1}{2}(x_1^2+\sum_{i=1}^{2n-2}(x_{i+1}-x_i)^2+x_{2n-1}^2))

    One can check this is a Gaussian vector with mean vector zero and 2n-1\times 2n-1 tridiagonal correlation matrix of the form:

\left(\begin{array}{ccccc} 2 & -1 & 0 & \cdots & 0 \\ -1 & 2 & -1 & \ddots & \vdots \\ 0 & -1 & \ddots & \ddots & 0 \\ \vdots & \ddots & \ddots & 2 & -1 \\ 0 & \cdots & 0 & -1 & 2 \end{array}\right)=M_{2n-1}
(note: for some obscure reason, large parentheses \left(, \right) are not recognized by the forum editor...) (works for me?!; otherwise you can use the "pmatrix" environment, which is lighter to use)
No, this is not the covariance matrix but its inverse. If it were the covariance matrix, you could read the variances off the diagonal coefficients...

The following way works: integrate the marginals one after another (from both sides toward the middle variable x_n; by symmetry you only have to do one side), doing it by hand for the first few in order to spot a pattern suitable for a proof by induction. This reduces the problem to studying a recursively defined real-valued sequence (something like u_{n+1}=2-\frac{1}{u_n}, where I don't specify what I'm dealing with...), for which you need an asymptotic expansion. This becomes a calculus question, where various methods apply (you should even be able to get an asymptotic equivalent for the variance).
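For illustration, a small Python sketch of one such recursion (the initial value u_1=2 and the closed form u_p=\frac{p+1}{p} are assumptions for this sketch, not necessarily the exact sequence you will find):

```python
# Iterate u_{p+1} = 2 - 1/u_p from the (assumed) initial value u_1 = 2
# and compare with the closed form u_p = (p+1)/p; the sequence
# decreases toward its fixed point 1.
u = 2.0
for p in range(1, 50):
    assert abs(u - (p + 1) / p) < 1e-12
    u = 2.0 - 1.0 / u
```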

I'll let you try to carry out this computation; tell me if you don't succeed.

  3. #3
    Junior Member
    Joined
    Nov 2009
    Posts
    57
Yes, it is the inverse; sorry about that. So the normalizing constant is actually c_n=\frac{\sqrt{2n}}{(2\pi)^{\frac{2n-1}{2}}}
I wrote my post too quickly; there is also no syntax issue with parentheses:
    \left(\begin{array}{cc} 1 & 0 \\ 0 & 1 \end{array}\right)
    Will have a go at the asymptotic expansion. Will keep you posted (so to speak).

  4. #4
    Junior Member
    Joined
    Nov 2009
    Posts
    57
Having carried out the calculation, the strange thing is that I actually end up with an explicit formula for the variance V(\epsilon_n), without needing any asymptotic expansion of a sequence. Here are the details of my calculations (sorry for the trivialities):

First we complete the square to isolate each marginal variable in the quadratic form:
x_1^2+(x_2-x_1)^2=2(x_1-\frac{1}{2}x_2)^2+\frac{1}{2}x_2^2
By symmetry, we get a similar expression for x_{2n-1}. At each step p, on each side, we are left with a residual term of the form \frac{1}{p+1}x_{p+1}^2 (resp. \frac{1}{p+1}x_{2n-(p+1)}^2), which is used in the next integration step.
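A numerical spot-check of this first completion of the square (a Python/numpy sketch):

```python
import numpy as np

# Verify x1^2 + (x2-x1)^2 = 2(x1 - x2/2)^2 + x2^2/2 at random points
rng = np.random.default_rng(0)
for _ in range(100):
    x1, x2 = rng.normal(size=2)
    lhs = x1**2 + (x2 - x1)**2
    rhs = 2 * (x1 - x2 / 2)**2 + x2**2 / 2
    assert abs(lhs - rhs) < 1e-10
```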

    Then, by integrating the marginals and using the symmetry:

    p(x_2,...,x_{2n-2})=\int_{x_1,x_{2n-1}}p(x_1,...,x_{2n-1})dx_1dx_{2n-1}

=c_n\exp(-\frac{1}{2}(\frac{1}{2}x_2^2+\sum_{i=2}^{2n-3}(x_{i+1}-x_i)^2+\frac{1}{2}x_{2n-2}^2))\cdot(\sqrt{2\pi}\sqrt{\frac{1}{2}})^2

The n-1 (pairs of) integrations each contribute a factor of the form:
2\pi\frac{p}{p+1}, for 1\leq p \leq n-1. Their product is (2\pi)^{n-1}\frac{1}{n}.

We then get p(x_n)=\frac{\sqrt{2n}}{(2\pi)^{\frac{2n-1}{2}}}(2\pi)^{n-1}\frac{1}{n}\exp(-\frac{1}{2}(\frac{1}{n}x_n^2+\frac{1}{n}x_n^2))=\frac{1}{\sqrt{n\pi}}\exp(-\frac{1}{n}x_n^2)

V(\epsilon_n)=\frac{1}{\sqrt{n\pi}}\int_{\mathbb{R}}x_n^2 e^{-\frac{1}{n}x_n^2}\,dx_n=\frac{n}{\sqrt{\pi}}\int_{\mathbb{R}}u^2 e^{-u^2}\,du=\frac{n}{\sqrt{\pi}}\cdot\frac{\sqrt{\pi}}{2}=\frac{n}{2}

using the change of variable u=\frac{x_n}{\sqrt{n}}. Hence the result.
    Too good to be true? Maybe your method is more general, in that case I would appreciate some details.
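As an independent check, one can also invert M_{2n-1} numerically and read the variance of the middle variable directly off the diagonal of the covariance matrix (a Python/numpy sketch; var_middle is a hypothetical helper name):

```python
import numpy as np

def var_middle(n):
    """Invert the inverse covariance matrix M_{2n-1} and return the
    middle diagonal entry of the covariance, i.e. Var(eps_n)."""
    N = 2 * n - 1
    M = 2 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)
    cov = np.linalg.inv(M)
    return cov[n - 1, n - 1]  # 0-based index of the middle variable

# Matches the explicit formula Var(eps_n) = n/2
for n in range(1, 10):
    assert abs(var_middle(n) - n / 2) < 1e-8
```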
    Last edited by akbar; December 29th 2009 at 03:14 PM.

  5. #5
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by akbar View Post
    Hence the result.
    Too good to be true? Maybe your method is more general, in that case I would appreciate some details.
This is very probably correct. I can't remember what I got, but I'm not surprised, since I actually had an explicit formula for my sequence as well. My method might work in a few other cases, but it is not very common to be able to get asymptotic estimates for a recursively defined sequence, so that extra generality probably doesn't matter much. Your way is much simpler to explain. I didn't believe in an explicit formula at first, hence what I did.

(for the computation of the variance, you could also simply recognize the pdf of a Gaussian with variance \frac{n}{2})
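Indeed, with \sigma^2=\frac{n}{2} the normal density \frac{1}{\sqrt{2\pi\sigma^2}}e^{-x^2/(2\sigma^2)} is exactly \frac{1}{\sqrt{n\pi}}e^{-x^2/n}; a quick numerical check (Python/numpy sketch):

```python
import numpy as np

n = 5
x = np.linspace(-4.0, 4.0, 81)
# the marginal p(x_n) obtained in the thread
marginal = np.exp(-x**2 / n) / np.sqrt(n * np.pi)
# the N(0, n/2) density
sigma2 = n / 2
normal = np.exp(-x**2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)
assert np.allclose(marginal, normal)
```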

  6. #6
    Junior Member
    Joined
    Nov 2009
    Posts
    57
    Thanks for the confirmation.
    Best wishes for the new year.

  7. #7
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Thanks, and happy (upcoming) new year to you too!

