In the model of a communication channel, we take the probability p of error to be < 1/2. Why not consider the case 1 ≥ p > 1/2? What if p = 1/2?
In Information theory - Wikipedia, the free encyclopedia, the binary entropy function is defined as...
$\displaystyle H_{b} (p) = -p\cdot \log_{2} p - (1-p) \cdot \log_{2} (1-p)$ (1)
In (1), $\displaystyle p$ is the probability that a transmitted binary symbol is received correctly, so that $\displaystyle 1-p$ is the error probability. It is easy to see from (1) that $\displaystyle 0 \le H_{b} (p) \le 1$ and that $\displaystyle H_{b} (p)$ is symmetric with respect to $\displaystyle p=\frac{1}{2}$, i.e.
$\displaystyle H_{b} (p) = H_{b} (1-p)$ (2)
... so that, from the point of view of information theory, transmission channels with $\displaystyle p=1$ and $\displaystyle p=0$ are perfectly equivalent ...
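As a quick numerical check (a minimal Python/NumPy sketch of my own, not part of the original posts), the following evaluates (1) on a grid and verifies both the bound $\displaystyle 0 \le H_{b}(p) \le 1$ and the symmetry (2):

```python
import numpy as np

def binary_entropy(p):
    """Binary entropy H_b(p) in bits, using the convention 0*log2(0) = 0."""
    p = np.asarray(p, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        h = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
    return np.nan_to_num(h)  # nan arises only at p = 0 or p = 1, where H_b = 0

ps = np.linspace(0.0, 1.0, 11)
for p in ps:
    print(f"H_b({p:.1f}) = {float(binary_entropy(p)):.4f}")

# Symmetry (2): H_b(p) = H_b(1 - p), so p = 0 and p = 1 are equivalent;
# the maximum of 1 bit is reached at p = 1/2.
assert np.allclose(binary_entropy(ps), binary_entropy(1.0 - ps))
```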
Kind regards
$\displaystyle \chi$ $\displaystyle \sigma$
The condition $\displaystyle p=\frac{1}{2}$ means that you don't transmit any information through the channel... effectively, an equivalent 'binary stream' can be obtained at the receiver with an [unbiased] coin toss ...
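To see this concretely, here is a small simulation sketch (my own illustration, assuming a binary symmetric channel and NumPy): at $\displaystyle p=\frac{1}{2}$ the output bits are statistically independent of the input bits, exactly as if the receiver tossed a fair coin.

```python
import numpy as np

rng = np.random.default_rng(0)

def bsc(bits, p_correct, rng):
    """Binary symmetric channel: delivers each bit correctly with
    probability p_correct and flips it otherwise."""
    flips = rng.random(bits.shape) >= p_correct
    return bits ^ flips

x = rng.integers(0, 2, size=1_000_000)  # random input bits
y = bsc(x, 0.5, rng)                    # channel with p = 1/2

# The conditional frequencies P(y=1 | x=0) and P(y=1 | x=1) both approach
# 1/2: the output carries no information about the input.
print(y[x == 0].mean(), y[x == 1].mean())
```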
Kind regards
$\displaystyle \chi$ $\displaystyle \sigma$
Think of it this way: If I predict whether a coin toss will come up heads or tails with error > 0.5, then you would simply begin using the opposite of my prediction as an (obviously) better prediction.
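A short sketch of that flipping trick (again my own illustration, not from the post): a predictor that is wrong 70% of the time becomes, once inverted, a predictor that is right 70% of the time.

```python
import numpy as np

rng = np.random.default_rng(1)

truth = rng.integers(0, 2, size=100_000)  # actual coin-toss outcomes
wrong = rng.random(truth.shape) < 0.7     # wrong 70% of the time (error > 0.5)
prediction = truth ^ wrong                # a deliberately bad predictor

print((prediction == truth).mean())        # ~0.30
print(((1 - prediction) == truth).mean())  # flipped prediction: ~0.70
```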
-Will Dwinnell
Data Mining in MATLAB