# Math Help - error correcting code

1. ## error correcting code

In a model of a communication channel, we take
the probability p of error to be < 1/2. Why not consider the
case 1 ≥ p > 1/2? What if p = 1/2?

2. In...

Information theory - Wikipedia, the free encyclopedia

... the binary entropy function is defined as...

$H_{b} (p) = -p\cdot \log_{2} p - (1-p) \cdot \log_{2} (1-p)$ (1)

In (1), $p$ is the probability that a transmitted binary symbol is received correctly, so that $1-p$ is the error probability. It is easy to see from (1) that $0 \le H_{b} (p) \le 1$ and that $H_{b} (p)$ is symmetric about $p=\frac{1}{2}$, i.e. ...

$H_{b} (p) = H_{b} (1-p)$ (2)

... so that, from the point of view of information theory, transmission channels with $p=1$ and $p=0$ are perfectly equivalent ...
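The definition (1) and the symmetry (2) are easy to check numerically. A minimal Python sketch (the function name `binary_entropy` is mine):

```python
import math

def binary_entropy(p):
    """Binary entropy H_b(p) in bits, with the convention H_b(0) = H_b(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))                             # the maximum: 1.0 bit
print(abs(binary_entropy(0.1) - binary_entropy(0.9)))  # symmetry (2): ~0
```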

Kind regards

$\chi$ $\sigma$

3. ## wat

I don't understand what happens when p = 1/2.

4. ## right

So you're saying that $H_{b}(p)$ reaches its maximum $H_{b}(1/2)=1$ at $p=1/2$?

5. The condition $p=\frac{1}{2}$ means that you don't transmit any information through the channel... effectively, an equivalent 'binary stream' could be obtained at the receiver with an [unbiased] coin toss ...
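One standard way to make "no information" precise (a textbook fact, not stated explicitly in the thread): the capacity of a binary symmetric channel with crossover probability $p$ is $C = 1 - H_{b}(p)$ bits per use, which vanishes exactly at $p = 1/2$. A sketch:

```python
import math

def bsc_capacity(p):
    """Capacity in bits/use of a binary symmetric channel, C = 1 - H_b(p)."""
    if p in (0.0, 1.0):
        return 1.0  # a deterministic channel (possibly inverted) is noiseless
    return 1.0 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

print(bsc_capacity(0.5))                      # 0.0: no information at p = 1/2
print(bsc_capacity(0.0), bsc_capacity(1.0))   # both 1.0: p = 0 and p = 1 equivalent
```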

Kind regards

$\chi$ $\sigma$

6. Originally Posted by amberxinya
In a model of a communication channel, we take
the probability p of error to be < 1/2. Why not consider the
case 1 ≥ p > 1/2? What if p = 1/2?
Think of it this way: If I predict whether a coin toss will come up heads or tails with error > 0.5, then you would simply begin using the opposite of my prediction as an (obviously) better prediction.

-Will Dwinnell
Data Mining in MATLAB
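The flipping argument can be checked with a quick simulation (the error rate 0.8 and all variable names are illustrative): invert every bit received from a channel with error probability p > 1/2 and the effective error rate drops to 1 - p.

```python
import random

random.seed(0)
p = 0.8  # hypothetical error probability greater than 1/2
sent = [random.randint(0, 1) for _ in range(100_000)]
# the channel flips each bit independently with probability p
received = [b ^ (random.random() < p) for b in sent]
# a receiver that knows p > 1/2 simply inverts everything it gets
decoded = [1 - r for r in received]

err_raw = sum(s != r for s, r in zip(sent, received)) / len(sent)
err_dec = sum(s != d for s, d in zip(sent, decoded)) / len(sent)
print(err_raw, err_dec)  # roughly 0.8 and 0.2: flipping turns error p into 1 - p
```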