## Conditional probability and independence

I hope someone can help me with this little exercise (Exercise 4.1 in Jaynes, *Probability Theory: The Logic of Science*):

Suppose that we have vectors of events $\{H_1,...,H_n\}$ and $\{D_1,...,D_m\}$ which satisfy:

(1) $P(H_i H_j)=0$ for any $i\neq j$ and $\sum_iP(H_i)=1$

(2) $P(D_rD_s|H_i)=P(D_r|H_i)P(D_s|H_i)$, for any $r\neq s$, $1\leq i\leq n$

(3) $P(D_rD_s|\overline{H_i})=P(D_r|\overline{H_i})P(D_s|\overline{H_i})$, for any $r\neq s$, $1\leq i\leq n$

where $\overline{H_i}$ denotes the negation of $H_i$.

Prove: If $n>2$, then at most one of the fractions

$$\frac{P(D_1|H_i)}{P(D_1|\overline{H_i})},\ \frac{P(D_2|H_i)}{P(D_2|\overline{H_i})},\ \dots,\ \frac{P(D_m|H_i)}{P(D_m|\overline{H_i})}$$

can differ from unity, for each $1\leq i\leq n$.
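To get a feel for the statement, here is a small numerical sanity check I put together (not a proof). The setup is hypothetical: $n=3$ equiprobable hypotheses and $m=2$ data events, where only $D_1$ carries information about $H$ and $D_2$ is independent of everything. The script verifies that conditions (1)-(3) hold and that exactly one likelihood ratio differs from unity, showing the "at most one" bound is attainable:

```python
from itertools import product

# Hypothetical example: n = 3 hypotheses, m = 2 data events.
# Only D_1 depends on H; D_2 is independent of everything,
# so conditions (2) and (3) hold and only D_1's ratio can differ from 1.
pH = [1/3, 1/3, 1/3]             # P(H_i): mutually exclusive, exhaustive
pD1_given_H = [0.9, 0.2, 0.2]    # P(D_1 | H_i) varies with the hypothesis
pD2 = 0.5                        # P(D_2), independent of H and D_1

# Joint probability over (i, d1, d2), d in {0 = false, 1 = true}
def joint(i, d1, d2):
    p1 = pD1_given_H[i] if d1 else 1 - pD1_given_H[i]
    p2 = pD2 if d2 else 1 - pD2
    return pH[i] * p1 * p2

def prob(event):                 # probability of a set of outcomes
    return sum(joint(i, d1, d2) for (i, d1, d2) in event)

outcomes = list(product(range(3), [0, 1], [0, 1]))

for i in range(3):
    Hi    = [o for o in outcomes if o[0] == i]
    notHi = [o for o in outcomes if o[0] != i]

    # Check conditional independence given H_i and given its negation
    for cond in (Hi, notHi):
        pc = prob(cond)
        pD1c  = prob([o for o in cond if o[1]]) / pc
        pD2c  = prob([o for o in cond if o[2]]) / pc
        pBoth = prob([o for o in cond if o[1] and o[2]]) / pc
        assert abs(pBoth - pD1c * pD2c) < 1e-12

    # Likelihood ratios: D_1's differs from unity, D_2's equals unity
    r1 = (prob([o for o in Hi if o[1]]) / prob(Hi)) / \
         (prob([o for o in notHi if o[1]]) / prob(notHi))
    r2 = (prob([o for o in Hi if o[2]]) / prob(Hi)) / \
         (prob([o for o in notHi if o[2]]) / prob(notHi))
    print(f"H_{i+1}: ratio for D_1 = {r1:.3f}, ratio for D_2 = {r2:.3f}")
```

The exercise claims the converse direction: once (1)-(3) hold with $n>2$, no distribution can make *two* of the ratios differ from unity.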