# Thread: Classification problem with 0-1 loss

1. ## Classification problem with 0-1 loss

Given training data $\displaystyle (X_1,Y_1),\dots,(X_n,Y_n)$ where $\displaystyle X_i \in \mathbb{R}^d$ and $\displaystyle Y_i \in \{0,1\}$, consider classification under 0-1 loss.

Let $\displaystyle p=P(Y=1)$, and let $\displaystyle R_{Bayes}$ denote the Bayes risk.

a) Show that $\displaystyle R_{Bayes} \leq \min(p,1-p)$.

b) Show that equality holds above if X and Y are independent.

c) Exhibit a joint distribution where X and Y are not independent but $\displaystyle R_{Bayes} = \min(p,1-p)$.

Solution so far:

a) Suppose that $\displaystyle f_{Bayes}$ is the Bayes learner, then:

$\displaystyle R_{Bayes} = \min_f E\,L[Y_k, f(X_k)] = \min_f \sum_{k=1}^n P[f(X_k) \neq Y_k]$
$\displaystyle \leq \sum_{k=1}^n P(1 \neq Y_k) = 1 - P(Y=1) = 1 - p$

I know that mine must be wrong because it doesn't help me with the next two problems. Please help, thanks!

2. ## Re: Classification problem with 0-1 loss

Hey tttcomrader.

Recall that 0 <= p <= 1, so the minimum of (p, 1 - p) will always be at most 1 - p.

To cover all the possibilities: if p < 1 - p, then p < 0.5; if 1 - p < p, then p > 0.5.

So if 1 - p is not the minimum, then p is the minimum, and in that case p is at most 1 - p, so the inequality still holds.
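A quick numerical sanity check of (a) and (b) might also help build intuition (a sketch; the particular distribution below, with X independent of Y and p = 0.3, is my own illustrative choice, not from the problem). Under 0-1 loss the Bayes rule predicts 1 exactly when P(Y=1 | X) > 1/2; when X is independent of Y that conditional probability is just p, so the Bayes rule is a constant classifier and its risk is min(p, 1 - p):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3            # P(Y = 1); illustrative choice with p < 0.5
n = 200_000

# X independent of Y: the features carry no information about the label.
Y = (rng.random(n) < p).astype(int)
X = rng.normal(size=n)   # drawn independently of Y

# Since P(Y=1 | X) = p < 1/2 here, the Bayes rule is the constant 0,
# and its risk is P(Y != 0) = p = min(p, 1 - p).
pred = np.zeros(n, dtype=int)
empirical_risk = np.mean(pred != Y)

print(f"empirical risk of the Bayes (constant) rule: {empirical_risk:.3f}")
print(f"min(p, 1 - p):                               {min(p, 1 - p):.3f}")
```

Comparing the two constant classifiers (always 0, risk p; always 1, risk 1 - p) is also exactly the argument that gives both halves of the bound in part (a).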