# Conditional probability and independence

• Mar 19th 2012, 05:37 PM
godelproof
Conditional probability and independence
I hope someone can help me with this little exercise (Exercise 4.1, *Probability Theory: The Logic of Science*, Jaynes):

Suppose that we have vectors of events $\displaystyle \{H_1,...,H_n\}$ and $\displaystyle \{D_1,...,D_m\}$ which satisfy:

(1) $\displaystyle P(H_i H_j)=0$ for any $\displaystyle i\neq j$ and $\displaystyle \sum_iP(H_i)=1$ (i.e., the $\displaystyle H_i$ are mutually exclusive and exhaustive)

(2) $\displaystyle P(D_rD_s|H_i)=P(D_r|H_i)P(D_s|H_i)$, for any $\displaystyle r\neq s$, $\displaystyle 1\leq i\leq n$

(3) $\displaystyle P(D_rD_s|\overline{H_i})=P(D_r|\overline{H_i})P(D_s|\overline{H_i})$, for any $\displaystyle r\neq s$, $\displaystyle 1\leq i\leq n$

where $\displaystyle \overline{H_i}$ means the negation of $\displaystyle H_i$.

Prove: If $\displaystyle n>2$, then for any fixed $\displaystyle 1\leq i\leq n$, at most one of the fractions

$\displaystyle \frac{P(D_1|H_i)}{P(D_1|\overline{H_i})},\ \frac{P(D_2|H_i)}{P(D_2|\overline{H_i})},\ \dots,\ \frac{P(D_m|H_i)}{P(D_m|\overline{H_i})}$

can differ from unity.
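Not a proof, but the statement can be sanity-checked numerically. The sketch below is my own construction (not from Jaynes): $n=3$ hypotheses and $m=2$ data events, where $D_1$ is informative about the $H_i$ while $D_2$ is independent of everything, so all of its likelihood ratios equal 1. The script verifies conditions (2) and (3) hold, and that only $D_1$'s ratio can differ from unity.

```python
from itertools import product

# Hypothetical parameters (my choice, for illustration only):
pH  = [0.2, 0.3, 0.5]   # P(H_i); mutually exclusive, sums to 1
pD1 = [0.9, 0.4, 0.1]   # P(D_1 | H_i): D_1 is informative
pD2 = 0.7               # P(D_2 | anything): D_2 is uninformative

# Joint distribution over (which H holds, D_1 true?, D_2 true?).
# D_1 and D_2 are conditionally independent given each H_i by construction.
joint = {}
for i, d1, d2 in product(range(3), [True, False], [True, False]):
    p = pH[i]
    p *= pD1[i] if d1 else 1 - pD1[i]
    p *= pD2 if d2 else 1 - pD2
    joint[(i, d1, d2)] = p

def prob(event):
    """Total probability of an event (a predicate on outcomes)."""
    return sum(p for k, p in joint.items() if event(*k))

def cond(event, given):
    """Conditional probability P(event | given)."""
    return prob(lambda *k: event(*k) and given(*k)) / prob(given)

for i in range(3):
    Hi   = lambda j, d1, d2: j == i      # H_i
    nHi  = lambda j, d1, d2: j != i      # complement of H_i
    D1   = lambda j, d1, d2: d1
    D2   = lambda j, d1, d2: d2
    D1D2 = lambda j, d1, d2: d1 and d2

    # Conditions (2) and (3): D_1, D_2 independent given H_i and given its complement
    assert abs(cond(D1D2, Hi)  - cond(D1, Hi)  * cond(D2, Hi))  < 1e-12
    assert abs(cond(D1D2, nHi) - cond(D1, nHi) * cond(D2, nHi)) < 1e-12

    # Likelihood ratios: at most one may differ from unity; here r2 is always 1
    r1 = cond(D1, Hi) / cond(D1, nHi)
    r2 = cond(D2, Hi) / cond(D2, nHi)
    assert abs(r2 - 1) < 1e-12
```

Of course this only exhibits one consistent configuration; the exercise asks to show that conditions (1)-(3) with $n>2$ *force* all but one ratio to be 1.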