
Math Help - Problems with multiple variables

  #1 SilenceInShadows (Newbie, joined Jun 2008, 24 posts)

    Let X1, X2, X3, be random variables (assume independence) with values 1 and 3 and have probability 1/2

    Suppose Y1 = X2 x X3 + X1 and Y2 = X1 x X2 - X3

    Calculate the joint probability density function f(y1, y2), and also the mean, variance and covariance, and the marginal pdfs of (Y1, Y2);
    also the conditional distribution and conditional mean of Y1 given Y2 = 0.

    It's the first part that I struggle with the most; I'm just really not sure what it's supposed to mean. Do all the variables X1, X2, X3 have probability 1/2? Does that mean Y1 = 1 and Y2 = -0.25?
    Also, what does the 'with values 1 and 3' bit mean?
    I don't know what to make of the joint p.d.f. either: f(X,Y)(xj, yj) = P(X = xj, Y = yj). The marginal pdf looks scary from here, but I'm sure it will make more sense once I get a handle on things.
    Last edited by SilenceInShadows; July 15th 2008 at 05:42 AM. Reason: More info

  #2 mr fantastic (Flow Master, joined Dec 2007, from Zeitgeist, 16,948 posts)
    Quote Originally Posted by SilenceInShadows View Post
    Let X1, X2, X3, be random variables (assume independence) with values 1 and 3 and have probability 1/2
    [snip]
    My interpretation would be that \Pr(X_i = 1) = \frac{1}{2} and \Pr(X_i = 3) = \frac{1}{2} for i = 1, 2, 3.

  #3 meymathis (Member, joined Jul 2008, 138 posts)
    I think Mr F. is exactly right. The wording is poor, but there is really only one interpretation.

    Let X1, X2, X3, be random variables (assume independence) with values 1 and 3 and have probability 1/2
    Another way of stating it might be something like this.
    Let X1, X2, X3, be random variables (assume independence). Each one can take on values of either 1 or 3 with equal probability (i.e. 1/2).

    Is that better?

    As far as the joint PDF etc. is concerned, probably the easiest way to go is to make a table of all the possible outcomes, with columns X_1, X_2, X_3, p, Y_1, Y_2, where p is the probability of that combination of X_1, X_2, X_3.


    For example, the first row might be 1, 1, 1, (1/2)^3, 2, 0. If you fill out the table completely (it should have 2^3 = 8 rows), you can then add up the probabilities p for which Y_1 = A and Y_2 = B, for every combination of A and B that appears in the Y_1, Y_2 columns.
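    If you want to check your table, the same enumeration can be done mechanically. Here is a small Python sketch of it (just an illustration, not part of the exercise; `Fraction` keeps the probabilities exact):

```python
from itertools import product
from fractions import Fraction

# Enumerate every equally likely combination of (X1, X2, X3) and
# accumulate the joint PMF of (Y1, Y2).
joint = {}
for x1, x2, x3 in product([1, 3], repeat=3):
    p = Fraction(1, 2) ** 3          # each Xi is 1 or 3 with probability 1/2
    y1 = x2 * x3 + x1
    y2 = x1 * x2 - x3
    joint[(y1, y2)] = joint.get((y1, y2), Fraction(0)) + p

for (y1, y2), p in sorted(joint.items()):
    print(y1, y2, p)
```

    The loop produces one row per outcome of (X1, X2, X3), and the dictionary does the "add up the probabilities for equal (Y_1, Y_2)" step automatically.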
    Last edited by meymathis; July 15th 2008 at 07:35 AM. Reason: Added stuff on calculating joint pdf

  #4 SilenceInShadows (Newbie)
    That's more helpful, yes, thank you. But how do I go about applying that to the question? I don't understand how to apply the information to the formulas that describe the joint pdf etc.

    It also tells me to display my answer from the joint pdf in a table.

  #5 meymathis (Member)
    I edited my previous post with the info that you are asking for, I think.

  #6 SilenceInShadows (Newbie)
    Am I to put the possible combinations of 1 and 3 in the first three columns, row by row so to speak? The probability will stay fixed, but the values of Y1 and Y2 will change?

    X1 X2 X3 Y1 Y2
    1 1 1 2 0
    1 1 3 4 -2
    1 3 1 4 2
    1 3 3 10 0
    3 1 1 4 2
    3 1 3 6 0
    3 3 1 6 8
    3 3 3 12 6

    OK, so that is what I make the table to be. Grouping the same Y1 Y2 values together gives:

    Y1 Y2 P(Y1, Y2)
    2 0 0.125
    4 -2 0.125
    10 0 0.125
    4 2 0.25
    6 0 0.125
    6 8 0.125
    12 6 0.125

    would that be an answer?
    Last edited by SilenceInShadows; July 15th 2008 at 09:09 AM.

  #7 meymathis (Member)
    Yep!

  #8 SilenceInShadows (Newbie)
    OK, the next step is the marginal pdf for Y1 and Y2, which is given by this scary formula:
    P(Y1 = y1i) = P(Y1 = y1i, -infinity < Y2 < infinity) = Sum_j P(Y1 = y1i, Y2 = y2j)
    (I really must learn that maths type font thing...)

    Erm, yes, I really don't get the infinities thing. Is it asking me to sum over all possible values of Y2 and calculate their probability, or something?

  #9 meymathis (Member)
    Quote Originally Posted by SilenceInShadows View Post
    [snip]
    Erm yes, I really don't get the infinities thing. Is it asking to sum all possible values of Y2 and calculate their probability or something?
    Don't be afraid of infinity; it's just an 8 that fell over. OK, bad joke. Basically, when it appears as a bound (upper or lower) in a sum or integral, it just means the range is unbounded.

    Another way of writing the same thing is:

    \mathbf{P}(Y_1 = y1) = \sum_{y2_i\in \mathbf{Y_2}}\mathbf{P}(Y_1 = y1 \text{ and }Y_2 = y2_i) where \mathbf{Y_2} is the set of all possible outcomes of Y_2

    Another way of saying it: to find \mathbf{P}(Y_1 = y1), all you have to do is add up the probabilities of all the outcomes from the joint PDF in which Y_1 = y1.

    By the way, to see the code that created these formulas, just click on one. It's basically LaTeX.
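    In code, that sum over all outcomes of Y_2 is a single loop. A quick Python sketch, just to illustrate (the joint PMF is hard-coded from the table worked out earlier in the thread):

```python
# Joint PMF of (Y1, Y2), taken from the table worked out above.
joint = {(2, 0): 0.125, (4, -2): 0.125, (10, 0): 0.125, (4, 2): 0.25,
         (6, 0): 0.125, (6, 8): 0.125, (12, 6): 0.125}

def marginal(joint, index):
    """Marginal PMF: sum the joint PMF over the other coordinate.

    index 0 gives the PMF of Y1; index 1 gives the PMF of Y2.
    """
    pmf = {}
    for outcome, p in joint.items():
        v = outcome[index]
        pmf[v] = pmf.get(v, 0.0) + p
    return pmf

print(marginal(joint, 0))  # PMF of Y1
print(marginal(joint, 1))  # PMF of Y2
```

    Each marginal just collects the joint probabilities sharing the same value of one coordinate, which is exactly what the sum over y2_i does.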

  #10 SilenceInShadows (Newbie)
    I'm still quite confused about what it's asking me to do; if somebody could take the time to explain further, that would be much appreciated.

    Also, if anybody can recommend a book, or link me to any worked examples, that would be amazingly helpful.

    Shadow.

  #11 meymathis (Member)
    Let's start with:

    \mathbf{P}(Y_1 = y1) = \sum_{y2_i\in \mathbf{Y_2}}\mathbf{P}(Y_1 = y1 \text{ and }Y_2 = y2_i)

    First, a comment about marginal distributions. Really, all this is saying is that you (or your teacher or book) want a probability distribution for a single random variable, and you are to calculate it from a joint PDF. That's it. You just want the PDF for, in this case, Y_1 (and likewise for Y_2).
    You calculated the joint pdf in the table:
    Y1 Y2 P(Y1, Y2)
    2 0 0.125
    4 -2 0.125
    10 0 0.125
    4 2 0.25
    6 0 0.125
    6 8 0.125
    12 6 0.125

    Let's start with the marginal PDF of Y_1
    So what are the outcomes of Y_1? Ans: 2,4,6,10,12

    \Pr(Y_1=2) = 0.125
    \Pr(Y_1=4) = \Pr(Y_1=4\text{ and }Y_2=-2)+\Pr(Y_1=4\text{ and }Y_2=2)=0.125+0.25 = 0.375
    and so on.

    You get the following table:
    Y_1\ |\  \Pr(Y_1=y_1)
    2, 0.125
    4, 0.375
    6, 0.25
    10, 0.125
    12, 0.125

    Try doing the Y_2 case.

    Make better sense?

    Now the conditional distributions are a little different. Conditionals say: given that you know Y_2=y_2, what is the distribution of Y_1? This is the conditional distribution of Y_1 conditioned on Y_2.

    More specifically
    \Pr(Y_1=y_1|Y_2=y_2)=\Pr(Y_1=y_1\text{ and }Y_2=y_2)/\Pr(Y_2=y_2)\hspace{1cm}(1)

    Here are the columns for the table:
    Y_2\ |\  \Pr(Y_1=2|Y_2) \ |\  \Pr(Y_1=4|Y_2)\ |\  \Pr(Y_1=6|Y_2) \ |\  \Pr(Y_1=10|Y_2)\ |\  \Pr(Y_1=12|Y_2)

    Let's do two rows in detail:
    For Y_2=-2: the denominator of (1) for the first row is \Pr(Y_2 = -2) = 0.125, which I got from the (marginal) distribution of Y_2. Also \Pr(Y_1 = 4 \text{ and } Y_2=-2) = 0.125, with all the other possible values of Y_1 having probability 0. So \Pr(Y_1=4|Y_2=-2)=0.125/0.125 = 1, with the rest being 0.
    -2, 0, 1, 0, 0, 0
    Next row:
    For Y_2=0: the denominator of (1) for this row is \Pr(Y_2 = 0) = 0.375, from the (marginal) distribution of Y_2. Here \Pr(Y_1 = 2 \text{ and } Y_2=0) = \Pr(Y_1 = 6 \text{ and } Y_2=0)=\Pr(Y_1 = 10 \text{ and } Y_2=0)=0.125, with all the other possible values of Y_1 having probability 0. So \Pr(Y_1=2|Y_2=0)=0.125/0.375 = 1/3, and so on.

    0, 1/3, 0, 1/3, 1/3, 0 [note that the sum of the probabilities is 1.]
    2, 0, 1, 0, 0, 0
    6, 0, 0, 0, 0, 1
    8, 0, 0, 1, 0, 0

    Step back a moment, and let's read the info you have tabulated.
    First I ask, what is the probability that Y_1=6? From the (marginal) PDF you can answer 0.25. Now I say, suppose you know that Y_2=0, now what is the probability that Y_1=6? Here you look at the conditional table. Now the answer is 1/3. Notice it is different. Knowing something about Y_2 tells you something about Y_1.

    Make sense?
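    If it helps, the conditioning step in equation (1) is only a few lines of Python (a sketch; the joint PMF is hard-coded from the table above):

```python
# Joint PMF of (Y1, Y2) from the table above.
joint = {(2, 0): 0.125, (4, -2): 0.125, (10, 0): 0.125, (4, 2): 0.25,
         (6, 0): 0.125, (6, 8): 0.125, (12, 6): 0.125}

def conditional_y1(joint, y2):
    """PMF of Y1 given Y2 = y2, using equation (1):
    Pr(Y1 | Y2 = y2) = Pr(Y1 and Y2 = y2) / Pr(Y2 = y2)."""
    rows = {y1: p for (y1, v2), p in joint.items() if v2 == y2}
    pr_y2 = sum(rows.values())        # the denominator Pr(Y2 = y2)
    return {y1: p / pr_y2 for y1, p in rows.items()}

print(conditional_y1(joint, -2))  # {4: 1.0}, matching the Y_2 = -2 row
print(conditional_y1(joint, 0))   # 2, 6 and 10, each with probability 1/3
```

    Each call reproduces one row of the conditional table: pick out the joint entries with that value of Y_2, then renormalise by Pr(Y_2 = y_2).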

  #12 SilenceInShadows (Newbie)
    It seems to me that the mean of Y1 could be as simple as 6?
    i.e.
    (2+4+4+4+6+6+10+12)/8?
    making the variance 10?
    (4+3x16+2x36+100+144)/8 - 36?
    Could someone confirm/correct me as to whether that is the right approach? Also, if the question asks me to calculate the covariance of Y1 and Y2, do I need to calculate Cov[Y1, Y2], given by E[(Y1-E[y2])(Y2-E[y1])]?
    The only remaining parts of the question after this are the conditional mean of Y1 given Y2=0, and whether Y1 and Y2 are independent or not.

    Many Thanks Silence

  #13 mr fantastic (Flow Master)
    Quote Originally Posted by SilenceInShadows View Post
    It seems to me that the mean of Y1 could be as simple as 6?
    i.e.
    (2+4+4+4+6+6+10+12)/8?
    making the variance 10?
    (4+3x16+2x36+100+144)/8 - 36?
    Could someone confirm/correct me as to whether that is the right approach? [snip]
    Sorry, but this is completely wrong, because the possible values of Y1 have different probabilities of occurring ......

    If you had a random variable A such that Pr(A = 0) = 0.001 and Pr(A = 10) = 0.999 would you calculate the mean of A to be (0 + 10)/2 = 5 .....?

    You must have met the formula E(X) = \sum_{i=1}^n x_i \, \Pr(X = x_i) \, ....

    It is worrying that you would think this at this level. I think you would be wise to thoroughly review all of the basic definitions and concepts.

    Quote Originally Posted by SilenceInShadows View Post
    [snip]
    Also, if the question asks me to calculate the covariance of Y1 and Y2, do I need to calculate Cov[Y1, Y2], given by E[(Y1-E[y2])(Y2-E[y1])]?
    [snip]
    The correct formula is \text{Cov} (Y_1 , Y_2) = E[(Y_1-E[{\color{red}Y_1}]) \, (Y_2-E[{\color{red}Y_2}])].

    Since this is one of the well-known formulas for calculating Cov[Y1, Y2], and since your previous calculations should have given you all of the data required for its use, .......

    Nevertheless, I think you'll find the alternative formula \text{Cov} (Y_1 , Y_2) = E(Y_1 Y_2) - E(Y_1) E(Y_2) more computationally efficient.
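    For what it's worth, once the joint PMF is in hand, both formulas are one-liners. A Python sketch (the joint PMF is hard-coded from the table earlier in the thread):

```python
# Joint PMF of (Y1, Y2) from the table earlier in the thread.
joint = {(2, 0): 0.125, (4, -2): 0.125, (10, 0): 0.125, (4, 2): 0.25,
         (6, 0): 0.125, (6, 8): 0.125, (12, 6): 0.125}

# E(X) = sum_i x_i Pr(X = x_i), applied through the joint PMF.
E1 = sum(y1 * p for (y1, y2), p in joint.items())        # E(Y1)
E2 = sum(y2 * p for (y1, y2), p in joint.items())        # E(Y2)
E11 = sum(y1 * y1 * p for (y1, y2), p in joint.items())  # E(Y1^2)
E12 = sum(y1 * y2 * p for (y1, y2), p in joint.items())  # E(Y1 Y2)

var1 = E11 - E1 ** 2      # Var(Y1) = E(Y1^2) - E(Y1)^2
cov = E12 - E1 * E2       # Cov(Y1, Y2) = E(Y1 Y2) - E(Y1) E(Y2)

print(E1, var1, cov)      # 6.0 10.0 4.0
```

    Note that every expectation is weighted by the joint probabilities rather than by counting outcomes as equally likely values.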

    Quote Originally Posted by SilenceInShadows View Post
    [snip]
    The only remaining parts of the question after this are the conditional mean of Y1 given Y2=0, and whether Y1 and Y2 are independent or not.
    Using the data provided by your previous calculations, I think you should be able to answer these two questions yourself after a thorough review of the relevant basic concepts and definitions.
    Last edited by mr fantastic; July 21st 2008 at 05:41 AM.

  #14 SilenceInShadows (Newbie)
    E(X) = \sum_{i=1}^n x_i \, \Pr(X = x_i)

    Using the above table:

    2, 0.125
    4, 0.375
    6, 0.25
    10, 0.125
    12, 0.125

    so, I would make that to be:

    2x0.125+4x0.375+6x0.25+10x0.125+12x0.125?

    Ok, so moving on to independent/dependent...

    Events Y1 and Y2 are independent iff:

    P(Y1 intersection Y2) = P(Y1)P(Y2)
    I think that they are therefore dependent as there are no events where 2 number appear in at the same time. Therefore the interestion of Y1 and Y2 is 0?
    Last edited by SilenceInShadows; July 21st 2008 at 03:04 PM.

  #15 mr fantastic (Flow Master)
    Quote Originally Posted by SilenceInShadows View Post
    E(X) = \sum_{i=1}^n x_i \, \Pr(X = x_i)

    Using the above table:

    2, 0.125
    4, 0.375
    6, 0.25
    10, 0.125
    12, 0.125

    so, I would make that to be:

    2x0.125+4x0.375+6x0.25+10x0.125+12x0.125?

    Mr F says: Correct.

    Ok, so moving on to independent/dependent...

    Events Y1,Y2 are independent iff:

    P(Y1 intersection Y2) = P(Y1)P(Y2)
    I think that they are therefore dependent, as there are no events where the two values occur at the same time. Therefore the intersection of Y1 and Y2 is 0?
    To prove dependence on the basis of calculations you've already done, it's sufficient to show that Pr(Y1 = y1 | Y2 = y2) is different from Pr(Y1 = y1) for some y1 and y2 ....
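    A brute-force check of the definition you quoted is also easy in Python (a sketch; the joint PMF is hard-coded from the earlier table):

```python
# Joint PMF of (Y1, Y2) from the table earlier in the thread.
joint = {(2, 0): 0.125, (4, -2): 0.125, (10, 0): 0.125, (4, 2): 0.25,
         (6, 0): 0.125, (6, 8): 0.125, (12, 6): 0.125}

# Marginal PMFs of Y1 and Y2.
p1, p2 = {}, {}
for (y1, y2), p in joint.items():
    p1[y1] = p1.get(y1, 0.0) + p
    p2[y2] = p2.get(y2, 0.0) + p

# Independence would require Pr(Y1=a, Y2=b) = Pr(Y1=a) Pr(Y2=b) for
# EVERY pair (a, b); a single counterexample proves dependence.
dependent = any(joint.get((a, b), 0.0) != p1[a] * p2[b]
                for a in p1 for b in p2)
print(dependent)  # True: e.g. Pr(Y1=2, Y2=0) = 0.125 but 0.125 * 0.375 = 0.046875
```

    One failing pair is enough, which is why comparing a conditional probability with the corresponding marginal also settles it.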
