# Classification problem with 0-1 loss

• October 10th 2012, 10:02 AM
Classification problem with 0-1 loss
Given training data $(X_1,Y_1),\ldots,(X_n,Y_n)$, where $X_i \in \mathbb{R}^d$ and $Y_i \in \{0,1\}$, consider classification under 0-1 loss.

Let $p = P(Y=1)$ and let $R_{Bayes}$ denote the Bayes risk.

a) Show that $R_{Bayes} \leq \min(p, 1-p)$.

b) Show that equality holds above if $X$ and $Y$ are independent.

c) Exhibit a joint distribution where $X$ and $Y$ are not independent but $R_{Bayes} = \min(p, 1-p)$.
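
For reference, one standard way to write the Bayes classifier and its risk under 0-1 loss uses the regression function $\eta(x) = P(Y=1 \mid X=x)$ (the $\eta$ notation is my own shorthand, not part of the problem statement):

$f_{Bayes}(x) = \mathbf{1}\{\eta(x) > 1/2\}, \qquad R_{Bayes} = \mathbb{E}\big[\min\{\eta(X),\, 1-\eta(X)\}\big]$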

Solution so far:

a) Suppose that $f_{Bayes}$ is the Bayes learner; then:

$R_{Bayes} = \min_f \mathbb{E}\, L[Y_k, f(X_k)] = \min_f \sum_{k=1}^n P[f(X_k) \neq Y_k]$
$\leq \sum_{k=1}^n P(1 \neq Y_k) = 1 - P(Y=1) = 1 - p$

I know that mine must be wrong because it doesn't help me with the next two parts. Please help, thanks!
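
For comparison, a possible sketch of a), assuming the risk of a classifier $f$ is $R(f) = P(f(X) \neq Y)$ for a single pair $(X,Y)$ rather than a sum over the training sample: the constant classifiers satisfy

$R(f \equiv 1) = P(Y \neq 1) = 1 - p, \qquad R(f \equiv 0) = P(Y \neq 0) = p$

and since $R_{Bayes}$ is the minimum of $R(f)$ over all classifiers $f$, it follows that $R_{Bayes} \leq \min(p, 1-p)$.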
• October 10th 2012, 06:50 PM
chiro
Re: Classification problem with 0-1 loss