# Bound probability of sum of two variables

• Jun 26th 2011, 11:56 AM
g322
Bound probability of sum of two variables
If x1 and x2 are independent, Pr(a&lt;x1&lt;b)=p, Pr(c&lt;x2&lt;d)=p, and y=x1+x2, can we prove that Pr(a+c&lt;y&lt;b+d)&gt;=p? I have proved it when x1 and x2 are normally distributed. Is it also true for general distributions?
• Jun 26th 2011, 12:23 PM
Moo
Re: Bound probability of sum of two variables
Hello,

Consider $\displaystyle A=\{a<X_1<b~,~c<X_2<d\}$ and $\displaystyle B=\{a+c<Y<b+d\}$

We have that $\displaystyle P(A)=P(a<X_1<b)P(c<X_2<d)=p^2$ (by independence).

We also know that $\displaystyle \begin{cases} a<X_1<b \\ c<X_2<d \end{cases}\Rightarrow a+c<Y<b+d$

Then recall that if we have an implication A => B, then $\displaystyle P(A)\leq P(B)$

This lets us write that $\displaystyle P(a+c<Y<b+d)\geq p^2$.
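The bound above is easy to check numerically. Here is a small Monte Carlo sketch (the choice of uniform variables and the specific intervals are my own, just for illustration); for Uniform(0,1) variables with the intervals below, both marginal probabilities equal p = 0.6, and the simulated probability for the sum comfortably exceeds p² = 0.36:

```python
import random

random.seed(0)
N = 100_000

a, b = 0.2, 0.8   # P(a < X1 < b) = 0.6 for X1 ~ Uniform(0, 1)
c, d = 0.3, 0.9   # P(c < X2 < d) = 0.6 for X2 ~ Uniform(0, 1)
p = 0.6

# Estimate P(a + c < X1 + X2 < b + d) by simulation
hits = sum(1 for _ in range(N)
           if a + c < random.random() + random.random() < b + d)
prob = hits / N

print(prob, p**2)
assert prob >= p**2   # Moo's bound holds
```

For these particular intervals the exact value is P(0.5 &lt; Y &lt; 1.7) = 0.83 (triangular density of the sum of two uniforms), far above p², so the p² bound is indeed very loose here.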
• Jun 26th 2011, 05:13 PM
g322
Re: Bound probability of sum of two variables
Pr(a+c&lt;y&lt;b+d)&gt;= p^2 is too conservative, because p^2 &lt; p for 0 &lt; p &lt; 1. It can be shown that if x1 and x2 are independently and normally distributed, Pr(a+c&lt;y&lt;b+d)&gt;= p. I am wondering if this is also true for other distributions.
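The normal case can be verified exactly with the standard library (the specific means, variances, and intervals below are my own example, chosen so that both marginal probabilities match): the sum of independent normals is normal, so all three probabilities are plain CDF differences.

```python
from statistics import NormalDist

s1, s2 = 1.0, 2.0
a, b = -1.0, 1.0   # interval for X1 ~ N(0, s1)
c, d = -2.0, 2.0   # interval for X2 ~ N(0, s2), scaled to give the same p

X1 = NormalDist(0, s1)
X2 = NormalDist(0, s2)
Y = NormalDist(0, (s1**2 + s2**2) ** 0.5)   # sum of independent normals

p1 = X1.cdf(b) - X1.cdf(a)       # ≈ 0.6827
p2 = X2.cdf(d) - X2.cdf(c)       # same p by construction
py = Y.cdf(b + d) - Y.cdf(a + c)

print(p1, p2, py)
assert abs(p1 - p2) < 1e-12
assert py >= p1   # the claimed bound holds in this normal example
```

This is only a spot check for one parameter choice, not a proof, but it is consistent with the claim for the normal case.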
• Jun 27th 2011, 06:24 AM
Guy
Re: Bound probability of sum of two variables
Quote:

Originally Posted by g322
Pr(a+c&lt;y&lt;b+d)&gt;= p^2 is too conservative, because p^2 &lt; p for 0 &lt; p &lt; 1. It can be shown that if x1 and x2 are independently and normally distributed, Pr(a+c&lt;y&lt;b+d)&gt;= p. I am wondering if this is also true for other distributions.

False in general. Take $\displaystyle X, Y$ to be independent rate-1 exponential random variables and a = c = 0, b = d = 1. Then $\displaystyle P(0<X<1)=P(0<Y<1)=1-e^{-1}\approx 0.632$, but $\displaystyle X+Y$ is Gamma(2,1), so $\displaystyle P(0<X+Y<2)=1-3e^{-2}\approx 0.594<0.632$.

Moo's work suggests to me that his bound is the best one can expect without more assumptions.
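Guy's counterexample can be confirmed in closed form, since the sum of two independent rate-1 exponentials has the Erlang CDF $\displaystyle F(t)=1-(1+t)e^{-t}$. A quick sketch:

```python
from math import exp

# Rate-1 exponentials, with a = c = 0 and b = d = 1
p = 1 - exp(-1)                 # P(0 < X < 1) = P(0 < Y < 1) ≈ 0.632

# X + Y ~ Gamma(2, 1) (Erlang), CDF F(t) = 1 - (1 + t) * e^{-t}
p_sum = 1 - (1 + 2) * exp(-2)   # P(0 < X + Y < 2) ≈ 0.594

print(p, p_sum)
assert p_sum < p                # the bound P >= p fails here
```

So the naive bound Pr(a+c&lt;y&lt;b+d) &gt;= p already fails for exponentials, which supports the claim that p² is the best distribution-free bound obtainable from Moo's argument.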
• Jun 30th 2011, 02:08 AM
Moo
Re: Bound probability of sum of two variables
Quote:

Originally Posted by Guy
Moo's work suggests to me that his bound is the best one can expect without more assumptions.

There is no doubt about it! It's my work you're talking about, constantly skimming perfection and exactness.

(hey, I'm just kidding)