
Math Help - independent random variables and a uniform distribution...

  1. #1
    Newbie iamthemanyes
    Joined
    May 2010
    Posts
    19

    independent random variables and a uniform distribution...

    If X, Y, and Z are independent and each follows a Uniform[0,T] distribution, what is the probability that the largest of the three is larger than the sum of the other two?

    How can we compute the joint density of X+Y and X/Y?


    ---
    I have tried a variety of things, but I'm getting answers with random variables in them (for the first part)... I'm not sure whether that's okay. Thanks!
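
    For the second question, one standard route is the change of variables U = X+Y, V = X/Y. A minimal sketch, assuming X and Y are two of the independent Uniform[0,T] variables above: inverting gives

    x = \frac{uv}{1+v}\ ,\quad y = \frac{u}{1+v}\ ,\quad \left|\frac{\partial(x,y)}{\partial(u,v)}\right| = \frac{u}{(1+v)^{2}}

    ... so that...

    f_{U,V}(u,v) = f_{X,Y}\left(\frac{uv}{1+v},\ \frac{u}{1+v}\right)\ \frac{u}{(1+v)^{2}} = \frac{1}{T^{2}}\ \frac{u}{(1+v)^{2}}

    for u, v > 0 such that both uv/(1+v) and u/(1+v) lie in [0,T], and zero otherwise.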

  2. #2
    MHF Contributor chisigma's Avatar
    Joined
    Mar 2009
    From
    near Piacenza (Italy)
    Posts
    2,162
    Thanks
    5

    Re: independent random variables and a uniform distribution...

    Quote Originally Posted by iamthemanyes View Post
    If X, Y, and Z are independent and each follows a Uniform[0,T] distribution, what is the probability that the largest of the three is larger than the sum of the other two?...
    If X has p.d.f. f_{X} (t), Y has p.d.f. f_{Y} (t) and Z has p.d.f. f_{Z} (t), then S=X+Y+Z has p.d.f. f_{S}(t) = f_{X} (t) * f_{Y} (t) * f_{Z}(t), where '*' means convolution. If for simplicity's sake we suppose T=1, then...

    \mathcal{L} \{f_{X} (t)\} = \mathcal{L} \{f_{Y} (t)\} = \frac{1-e^{-s}}{s} (1)

    \mathcal{L} \{f_{-Z} (t)\} = \frac{e^{s}-1}{s} (2)

    ... so that if S=X+Y-Z then...

    \mathcal{L} \{f_{S} (t)\} = \mathcal{L} \{f_{X} (t)\}\ \mathcal{L} \{f_{Y} (t)\}\ \mathcal{L} \{f_{-Z} (t)\} =

     = \frac{(1-e^{-s})^{2}\ (e^{s}-1)}{s^{3}} = \frac{e^{s} -3 +3\ e^{-s} - e^{-2\ s}}{s^{3}} (3)

    ... so that is...

    f_{S} (t)= \frac{1}{2}\ \{(t+1)^{2}\ u(t+1) - 3\ t^{2}\ u(t) + 3\ (t-1)^{2}\ u(t-1) - (t-2)^{2}\ u(t-2)\} (4)

    ... and the requested probability is...

    P\{S<0\} = \int_{-1}^{0} f_{S}(t)\ dt = \frac{1}{6} (5)
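
    As a quick sanity check of (5), a minimal Monte Carlo sketch in Python:

    Code:
    import random

    # Estimate P(S < 0) for S = X + Y - Z, with X, Y, Z iid Uniform(0,1);
    # (5) predicts 1/6 ~ 0.1667.
    N = 1_000_000
    hits = sum(random.random() + random.random() < random.random() for _ in range(N))
    print(hits / N)  # should be close to 0.1667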

    Kind regards

    \chi \sigma

  3. #3
    Moo
    A Cute Angle Moo's Avatar
    Joined
    Mar 2008
    From
    P(I'm here)=1/3, P(I'm there)=t+1/3
    Posts
    5,618
    Thanks
    6

    Re: independent random variables and a uniform distribution...

    Quote Originally Posted by chisigma View Post
    If X has p.d.f. f_{X} (t), Y has p.d.f. f_{Y} (t) and Z has p.d.f. f_{Z} (t), then S=X+Y+Z has p.d.f. f_{S}(t) = f_{X} (t) * f_{Y} (t) * f_{Z}(t), where '*' means convolution. [...]
    This is completely false. You can't assume that Z is the max of the 3, since there is randomness: the max can vary. A very simple way to see that is that max(X,Y,Z) has a different distribution than X, Y or Z.
    And who would use this symbol for Laplace transforms in probability? We'd rather talk about the m.g.f.
    Calculations with inverse Laplace transforms are not common in probability anyway.

    The problem is rather asking for P(\max(X,Y,Z)>X+Y+Z-\max(X,Y,Z)).


    By the way: you're giving a full "solution", but you're not explaining anything! Isn't it usually the other way round?
    Last edited by Moo; August 1st 2011 at 07:17 AM.

  4. #4
    MHF Contributor chisigma's Avatar
    Joined
    Mar 2009
    From
    near Piacenza (Italy)
    Posts
    2,162
    Thanks
    5

    Re: independent random variables and a uniform distribution...

    Indeed I didn't read the question carefully, and what I computed is the probability P\{Z>X+Y\}. What was requested is the probability that the largest variable exceeds the sum of the other two, i.e. the quantity...

    P\{Z>X+Y | Z= \text{max}\ (X,Y,Z)\} = \frac{P\{Z>X+Y\}}{P\{Z= \text{max}\ (X,Y,Z)\}} = \frac{\frac{1}{6}}{\frac{1}{3}} = \frac{1}{2} (1)
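
    Spelling out the two quantities in (1), a step the computation skips: for 0 \le z \le 1 one has P\{X+Y \le z\} = \frac{z^{2}}{2}, so

    P\{Z>X+Y\} = \int_{0}^{1} \frac{z^{2}}{2}\ dz = \frac{1}{6}\ ,\qquad P\{Z= \text{max}\ (X,Y,Z)\} = \frac{1}{3}\ \text{by symmetry}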

    Regarding the application of the Laplace Transform to this particular problem: it only uses the basic property that a random variable...

    X= X_{1} + X_{2} + ...+ X_{n} (2)

    ... which is the sum of n independent variables X_{i}, i=1,2,...,n, each with p.d.f. f_{i}(t), has p.d.f. ...

    f(t)= f_{1}(t)\ *\ f_{2}(t)\ *\ ... *\ f_{n}(t) (3)

    ... where * means convolution. It is well known that the convolution is, when possible, much easier to perform in the s domain than in the t domain. An example in which this type of approach is very efficient is the problem proposed in...

    http://www.mathhelpforum.com/math-he...re-185513.html
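
    As a small numerical illustration of (3), assuming NumPy is available: sampling the Uniform(0,1) p.d.f. on a grid and convolving recovers the triangular density of the sum of two such variables.

    Code:
    import numpy as np

    h = 1e-3
    f = np.ones(1000)              # Uniform(0,1) pdf sampled with step h
    g = np.convolve(f, f) * h      # pdf of the sum: the triangle on [0, 2]
    t = h * np.arange(g.size)
    print(g.max(), t[g.argmax()])  # peak ~ 1.0 near t = 1, as expected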

    Kind regards

    \chi \sigma

  5. #5
    Moo
    A Cute Angle Moo's Avatar
    Joined
    Mar 2008
    From
    P(I'm here)=1/3, P(I'm there)=t+1/3
    Posts
    5,618
    Thanks
    6

    Re: independent random variables and a uniform distribution...

    Quote Originally Posted by chisigma View Post
    Indeed I didn't read the question carefully, and what I computed is the probability P\{Z>X+Y\}. What was requested is the probability that the largest variable exceeds the sum of the other two, i.e. the quantity...

    P\{Z>X+Y | Z= \text{max}\ (X,Y,Z)\} = \frac{P\{Z>X+Y\}}{P\{Z= \text{max}\ (X,Y,Z)\}} = \frac{\frac{1}{6}}{\frac{1}{3}} = \frac{1}{2} (1)
    I still disagree with this. What makes you think that there is only Z? There can be an event where the max is Z, and another event where the max is X. That makes your computation incomplete: it is only one third of the desired probability.

    An explanation for this :

    P(\text{max is} > \text{the other two}) = P(\text{max is} > \text{the other two},\ X=\text{max}) + P(\text{max is} > \text{the other two},\ Y=\text{max}) + P(\text{max is} > \text{the other two},\ Z=\text{max}) = 3\,P(Z>X+Y,\ Z=\text{max}(X,Y,Z))
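
    A minimal Monte Carlo sketch of this decomposition in Python:

    Code:
    import random

    # Estimate P(max(X,Y,Z) > sum of the other two), X, Y, Z iid Uniform(0,1);
    # the decomposition above gives 3 * 1/6 = 1/2.
    N = 1_000_000
    hits = 0
    for _ in range(N):
        x, y, z = random.random(), random.random(), random.random()
        if 2 * max(x, y, z) > x + y + z:  # max exceeds the sum of the other two
            hits += 1
    print(hits / N)  # should be close to 0.5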

    You're not explaining how you came up with the pdf of S. Great job with the inverse Laplace transforms, but as I said previously, this is not a technique commonly taught in probability courses.

    And what the heck is "u" in your first post?
    Sincerely, try to talk in a probabilistic way; your approaches are more confusing than helpful. And let's not talk about the full "solutions" you're always giving.

    ... regarding the application of the Laplace Transform to this particular problem: it only uses the basic property that a random variable...
    Use MGF's.

    X= X_{1} + X_{2} + ...+ X_{n} (2)

    ... which is the sum of n independent variables X_{i}, i=1,2,...,n, each with p.d.f. f_{i}(t), has p.d.f. ...

    f(t)= f_{1}(t)\ *\ f_{2}(t)\ *\ ... *\ f_{n}(t) (3)

    ... where * means convolution.
    Do you feel like repeating this all over again and again ?

    It is well known that the convolution is, when possible, much easier to perform in the s domain than in the t domain.
    What does that mean? You make no effort to talk in terms of probability; people are not familiar with the "s domain" and "t domain"!
    Convolution is not often used in this kind of problem. I computed the pdf of X+Y with the convolution method and it's just an ugly result that one can barely use.

    An example in which this type of approach is very efficient is the problem proposed in...

    http://www.mathhelpforum.com/math-he...re-185513.html
    Certainly not.

  6. #6
    MHF Contributor chisigma's Avatar
    Joined
    Mar 2009
    From
    near Piacenza (Italy)
    Posts
    2,162
    Thanks
    5

    Re: independent random variables and a uniform distribution...

    Quote Originally Posted by chisigma View Post
    ... regarding the application of the Laplace Transform to this particular problem... [...] An example in which this type of approach is very efficient is the problem proposed in...

    http://www.mathhelpforum.com/math-he...re-185513.html
    Here the text...

    ...suppose X_{1}, X_{2}, ..., X_{100} are independent random variables with common mean \mu and variance \sigma^{2}. Let X be their average. What is the probability that |X - \mu| is greater than or equal to 0.25?...

    Very well!... in general if n is the number of random variables, the 'average' is by definition...

    X= \frac{X_{1}+ X_{2} + ... + X_{n}}{n} (1)

    ... i.e. their sum divided by n. Now we suppose that the X_{k} are all uniformly distributed between 0 and 1 [so that we examine a particular case, not the general case...], so that \mu=\frac{1}{2} and \sigma^{2}= \frac{1}{12}. If...

    f_{1}(t) = 1\, \text{if}\ 0<t<1\ ;\ 0\ \text{otherwise} (2)

    ... then ...

    \mathcal{L}\{f_{1}(t)\} = \frac{1-e^{-s}}{s} (3)

    ... so that, neglecting the division by n in (1) and writing f_{n}(t) for the p.d.f. of the sum X_{1}+ X_{2}+ ...+ X_{n}, we have...

    \mathcal{L}\{f_{n}(t)\} = \frac{(1-e^{-s})^{n}}{s^{n}} (4)

    ... and performing the Inverse Laplace Transform of (4) we derive...

    f_{n} (t)= \frac{1}{(n-1)!}\ \sum_{k=0}^{n} (-1)^{k}\ \binom{n}{k}\ \gamma_{k} (t) (5)

    ... where...

    \gamma_{k}(t)= (t-k)^{n-1}\ u(t-k) (6)

    Of course for large n the computation of (5) requires a computer, and some fifteen years ago I wrote a specific program for it. Using this program we find that for n=100 (with X now denoting the sum, which has mean 50)...

    P\{|X-50|>25\} = 2\ \int_{0}^{25} f_{100}(t)\ dt \approx 1.79 \cdot 10^{-19} (7)
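
    The value in (7) can be reproduced without a custom program: (5)-(6) integrate in closed form to the c.d.f. F_{n}(x) = \frac{1}{n!}\ \sum_{k=0}^{\lfloor x \rfloor} (-1)^{k}\ \binom{n}{k}\ (x-k)^{n}, which exact rational arithmetic can evaluate. A minimal sketch in Python:

    Code:
    from fractions import Fraction
    from math import comb, factorial

    def sum_of_uniforms_cdf(x, n):
        # Exact P(S < x) for S a sum of n iid Uniform(0,1), integer 0 <= x <= n.
        s = sum((-1) ** k * comb(n, k) * Fraction(x - k) ** n for k in range(x + 1))
        return s / factorial(n)

    # Two-sided tail P(|S - 50| > 25) = 2 * P(S < 25) by symmetry, for n = 100;
    # compare with the value quoted in (7).
    print(float(2 * sum_of_uniforms_cdf(25, 100)))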

    Kind regards

    \chi \sigma

  7. #7
    Moo
    A Cute Angle Moo's Avatar
    Joined
    Mar 2008
    From
    P(I'm here)=1/3, P(I'm there)=t+1/3
    Posts
    5,618
    Thanks
    6

    Re: independent random variables and a uniform distribution...

    Quote Originally Posted by chisigma View Post
    Here the text...

    ...suppose X_{1}, X_{2}, ..., X_{100} are independent random variables with common mean \mu and variance \sigma^{2}. Let X be their average. What is the probability that |X - \mu| is greater than or equal to 0.25?...

    [...]

    P\{|X-50|>25\} = 2\ \int_{0}^{25} f_{100}(t)\ dt \approx 1.79 \cdot 10^{-19} (7)
    Calling a method efficient when another one can give you the answer in a few lines is absurd. Use the CLT if you know the distribution (or even the first two moments) and go back home with your Laplace transforms.
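
    For instance, a two-line normal-approximation sketch (rough this far into the tail, but immediate):

    Code:
    from math import erfc, sqrt

    # CLT: the sum of 100 iid Uniform(0,1) is approximately N(50, 100/12).
    z = 25 / sqrt(100 / 12)   # about 8.66 standard deviations
    print(erfc(z / sqrt(2)))  # ~ 5e-18; compare the exact ~ 1.79e-19 above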

    You're insisting on using a method that is long, useless and inappropriate.

    Your posts with that method don't even mention how you get the density; no one can tell whether it's correct without having studied inverse Laplace transforms, which are certainly not a prerequisite for probability theory at this level.

  8. #8
    Flow Master
    mr fantastic's Avatar
    Joined
    Dec 2007
    From
    Zeitgeist
    Posts
    16,948
    Thanks
    5

    Re: independent random variables and a uniform distribution...

    I think the OP now has sufficient information to answer his/her question. If not, s/he can pm me.

