# Thread: Prove use of axiom

1. ## Prove use of axiom

(I'm not sure whether this thread belongs in the Pre-University or the University forum.)

How can we prove that for a discrete probability distribution $\displaystyle D = \langle d_1, d_2, \ldots, d_n\rangle$

$\displaystyle \sum_{i=1}^n P(D=d_i)=1$

using the axioms of probability?

---------

My start:

We know that $\displaystyle 0\le \sum_{i=1}^n P(D=d_i)\le n$ since $\displaystyle 0\le P(D = d_i) \le 1 \quad \forall i$

but then what?

2. It is impossible to answer that question without knowing the complete set of axioms you have been given.
In many textbooks, such as Introduction to Probability by Laurie Snell, that is in fact part of the definition of a probability density on a finite sample space.

3. Oh, I see. My textbook lists three axioms:

1. All probabilities are between 0 and 1. For any proposition $\displaystyle a$, $\displaystyle 0\le P(a)\le 1$

2. $\displaystyle P(true)=1, \quad P(false)=0$

3. Probability of a disjunction:

$\displaystyle P(a\vee b)=P(a)+P(b)-P(a\wedge b)$
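As a quick sanity check of axiom 3 on a concrete example (a fair six-sided die; this illustration is my own, not from the textbook):

```python
from fractions import Fraction

# Sample space of a fair six-sided die; each outcome is equally likely.
omega = {1, 2, 3, 4, 5, 6}

def prob(event):
    """Probability of an event, represented as a subset of the sample space."""
    return Fraction(len(event), len(omega))

a = {2, 4, 6}  # proposition: "the roll is even"
b = {4, 5, 6}  # proposition: "the roll is at least 4"

# Axiom 3: P(a or b) = P(a) + P(b) - P(a and b)
lhs = prob(a | b)
rhs = prob(a) + prob(b) - prob(a & b)
assert lhs == rhs == Fraction(2, 3)
```

Here $P(a) = P(b) = 1/2$ and $P(a \wedge b) = 1/3$, so both sides come out to $2/3$, as axiom 3 requires.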

4. Originally Posted by scorpion007
Oh, I see. My textbook lists three axioms:
1. All probabilities are between 0 and 1. For any proposition $\displaystyle a$, $\displaystyle 0\le P(a)\le 1$
2. $\displaystyle P(true)=1, \quad P(false)=0$
3. Probability of a disjunction:
$\displaystyle P(a\vee b)=P(a)+P(b)-P(a\wedge b)$
This is a sheer guess. But the finite space $\displaystyle \Omega = \bigcup\limits_{k=1}^{n} {\{ d_k \} }$ is the union of the elementary events.
Noting that $\displaystyle j \ne k \Rightarrow \quad \{ d_j \} \cap \{ d_k \} = \emptyset$, the conjunction $\displaystyle d_j \wedge d_k$ is false, so from axiom 2, $\displaystyle P(d_j \wedge d_k ) = 0$.
Then from axiom 3, $\displaystyle P(d_j \vee d_k ) = P(d_j ) + P(d_k )$.
Then by induction we see that $\displaystyle \sum\limits_{k = 1}^n {P(d_k )} = P\left( \Omega \right) = 1$.
The last step is again from axiom 2: the disjunction of all the elementary events is true.
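Spelling out the induction step in the argument above (my own expansion, using only the three axioms and pairwise disjointness of the elementary events):

```latex
\begin{align*}
P\Bigl(\bigvee_{k=1}^{m+1} d_k\Bigr)
  &= P\Bigl(\bigvee_{k=1}^{m} d_k\Bigr) + P(d_{m+1})
     - P\Bigl(\Bigl(\bigvee_{k=1}^{m} d_k\Bigr) \wedge d_{m+1}\Bigr)
     && \text{(axiom 3)}\\
  &= \sum_{k=1}^{m} P(d_k) + P(d_{m+1}) - 0
     && \text{(induction hypothesis; disjointness and axiom 2)}\\
  &= \sum_{k=1}^{m+1} P(d_k).
\end{align*}
```

Finally, $\displaystyle \bigvee_{k=1}^{n} d_k$ is the proposition that $D$ takes some value at all, which is true, so axiom 2 gives $\displaystyle \sum_{k=1}^{n} P(d_k) = P(\text{true}) = 1$.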