In the model of a communication channel, we take
the probability p of error to be < 1/2. Why not consider the
case 1 ≥ p > 1/2? What if p = 1/2?
In the Wikipedia article "Information theory", the binary entropy function is defined as...
H(p) = −p log₂(p) − (1 − p) log₂(1 − p)   (1)
In (1), 1 − p is the probability that a transmitted binary symbol is received correctly, so p is the 'error probability'. It is easy to see from (1) that H(p) reaches its maximum of 1 at p = 1/2, and that H(p) is 'mirror imaging' (symmetric) with respect to p = 1/2, i.e. ...
H(p) = H(1 − p)   (2)
... so that, from the point of view of information theory, transmission channels with error probabilities p and 1 − p are perfectly equivalent ...
Kind regards
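The symmetry in (2) is easy to check numerically. Here is a minimal Python sketch; the function name and the use of the binary symmetric channel capacity formula C = 1 − H(p) are my additions, not from the thread:

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.1, 0.25, 0.5):
    h_p = binary_entropy(p)
    h_q = binary_entropy(1 - p)
    # Symmetry: H(p) = H(1 - p), so the BSC capacity 1 - H(p) is the
    # same for error probabilities p and 1 - p, and drops to 0 at p = 1/2.
    print(f"p={p}: H(p)={h_p:.4f}, H(1-p)={h_q:.4f}, capacity={1 - h_p:.4f}")
```

At p = 1/2 the entropy is 1 bit and the capacity is 0, which answers the last part of the question: a channel that errs exactly half the time conveys no information at all.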
Think of it this way: If I predict whether a coin toss will come up heads or tails with error > 0.5, then you would simply begin using the opposite of my prediction as an (obviously) better prediction.
-Will Dwinnell
Data Mining in MATLAB
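Will Dwinnell's prediction-flipping argument can be sketched numerically: if a binary predictor is wrong more than half the time, negating its output yields a predictor that is right more than half the time (a hypothetical simulation; the names and the 70% error rate are my choices):

```python
import random

random.seed(0)
truth = [random.randint(0, 1) for _ in range(10_000)]

# A deliberately bad predictor: it flips the true label ~70% of the time.
bad = [t if random.random() >= 0.7 else 1 - t for t in truth]

# "Use the opposite of the prediction": correct exactly when `bad` is wrong,
# so its accuracy is 1 minus the bad predictor's accuracy.
flipped = [1 - b for b in bad]

def accuracy(pred):
    return sum(p == t for p, t in zip(pred, truth)) / len(truth)

print(f"bad: {accuracy(bad):.3f}, flipped: {accuracy(flipped):.3f}")
```

This is exactly why only p < 1/2 needs to be considered: any channel (or predictor) with p > 1/2 can be relabeled into one with error probability 1 − p < 1/2.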