joint density question - math stats

  1. #1 muskie (Newbie)

    mathematical statistics
    Let f(x) be a density on R^+ (so f(x) = 0 for x < 0). Let g(x,y) = f(x+y)/(x+y) for x > 0, y > 0.
    a) Show that g is a density on R^2.
    b) Assume that the expectation mu and variance sigma^2 associated with the univariate density f exist, and that mu^2 does not equal 2 sigma^2. Show that X and Y are dependent.


    I have part a) done. As for part b), I am terribly confused. I first took the question as meaning E[X] = mu and E[Y] = mu. Then I assumed that X and Y were independent, for a proof by contradiction. So E[XY] = mu^2 and Var(X+Y) = 2 sigma^2 (since Cov(X,Y) = 0, and also assuming Var(X) = sigma^2 = Var(Y)). Then from the question we would have E[XY] not equal to Var(X+Y), from which I got nowhere.

    So I thought I would have to go back and work with the joint density g(x,y) and use the definition of independence for a joint density, g(x,y) = g_X(x) g_Y(y), except I have no clue how to do the integral for either marginal.

    Any hints as to how to attack this problem would be greatly appreciated.
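
    For what it's worth, the marginal integral mentioned above can be reduced by the one-variable substitution s = x + y; this is only a sketch, not necessarily the intended route:

    g_X(x) = \int_0^\infty \frac{f(x+y)}{x+y}\,dy = \int_x^\infty \frac{f(s)}{s}\,ds, \qquad x > 0,

    and by symmetry g_Y(y) = \int_y^\infty \frac{f(s)}{s}\,ds. Whether g(x,y) = g_X(x)\,g_Y(y) then depends on the particular f, which is one reason a moment-based argument (as in the reply below) can be easier.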

  2. #2 MHF Contributor
    I think you understood part b) incorrectly. The question should have been stated more explicitly: "Let (X,Y) be a random vector distributed with joint density g(x,y)=\frac{f(x+y)}{x+y}. Assume, on the other hand, that the distribution with density f has mean \mu and variance \sigma^2. Show that X and Y are dependent if \mu^2\neq 2\sigma^2."

    Thus E[X]\neq \mu a priori. But E[X]=\int\int x g(x,y) dx dy.

    Further hints: prove that E[X]=E[Y]=\frac{\mu}{2} and that E[XY]=\frac{1}{6}(\sigma^2+\mu^2), and conclude from there.
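
    One possible way to carry out these computations (a sketch only, using the substitution s = x + y; the intended argument may differ):

    For the sum S = X + Y, the density is
    f_S(s) = \int_0^s g(x, s-x)\,dx = \int_0^s \frac{f(s)}{s}\,dx = f(s), \qquad s > 0,
    so S has density f, hence E[S] = \mu and E[S^2] = \sigma^2 + \mu^2. By the symmetry g(x,y) = g(y,x) we get E[X] = E[Y] = \mu/2.

    For E[XY], integrate in x along each line x + y = s:
    E[XY] = \int_0^\infty \frac{f(s)}{s} \left( \int_0^s x(s-x)\,dx \right) ds = \int_0^\infty \frac{f(s)}{s} \cdot \frac{s^3}{6}\, ds = \frac{1}{6} E[S^2] = \frac{\sigma^2 + \mu^2}{6}.

    If X and Y were independent, then E[XY] = E[X]E[Y] = \mu^2/4, and equating \frac{\sigma^2+\mu^2}{6} = \frac{\mu^2}{4} gives \mu^2 = 2\sigma^2. So if \mu^2 \neq 2\sigma^2, X and Y cannot be independent.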

