Given training data $\displaystyle (X_1,Y_1),\dots,(X_n,Y_n) $, where $\displaystyle X_i \in \mathbb {R}^d $ and $\displaystyle Y_i \in \{ 0,1 \} $, under the 0-1 loss.

Let $\displaystyle p=P(Y=1) $, and let $\displaystyle R_{Bayes} $ denote the Bayes risk.

a) Show that $\displaystyle R_{Bayes} \leq \min(p,1-p) $.

b) Show that equality holds above if $\displaystyle X $ and $\displaystyle Y $ are independent.

c) Exhibit a joint distribution where $\displaystyle X $ and $\displaystyle Y $ are not independent but $\displaystyle R_{Bayes} = \min(p,1-p) $.

Solution so far:

a) Suppose that $\displaystyle f_{Bayes} $ is the Bayes learner. Then:

$\displaystyle R_{Bayes} = \min_f E\,L[Y,f(X)] = \min_f P[f(X) \neq Y] $

(the risk is over a fresh pair $\displaystyle (X,Y) $, not a sum over the training points). Comparing against the constant classifier $\displaystyle f \equiv 1 $,

$\displaystyle R_{Bayes} \leq P(1 \neq Y) = 1-P(Y=1) = 1-p, $

and comparing against $\displaystyle f \equiv 0 $ gives $\displaystyle R_{Bayes} \leq P(0 \neq Y) = P(Y=1) = p $. Together these give $\displaystyle R_{Bayes} \leq \min(p,1-p) $.
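As a quick numerical sanity check of the bound in (a), here is a small sketch for a discrete $X$. The joint table `p_xy` is an arbitrary example I made up; for a finite $X$, the Bayes risk under 0-1 loss is $\sum_x \min\big(P(X=x,Y=0),\,P(X=x,Y=1)\big)$, since the Bayes classifier predicts the more likely label at each $x$:

```python
import numpy as np

# Made-up joint distribution over X in {0,1,2} and Y in {0,1}:
# p_xy[x, y] = P(X = x, Y = y).
p_xy = np.array([
    [0.30, 0.05],
    [0.10, 0.25],
    [0.05, 0.25],
])
assert np.isclose(p_xy.sum(), 1.0)

p = p_xy[:, 1].sum()  # P(Y = 1)

# Bayes risk under 0-1 loss: at each x, predict the more likely label,
# so the error contributed at x is the smaller of the two joint masses.
bayes_risk = p_xy.min(axis=1).sum()

print(f"p = {p:.2f}, Bayes risk = {bayes_risk:.2f}, min(p, 1-p) = {min(p, 1 - p):.2f}")
assert bayes_risk <= min(p, 1 - p) + 1e-12
```

Here $X$ and $Y$ are dependent, the Bayes risk comes out strictly below $\min(p,1-p)$, and replacing the table with a product distribution (independence) should make the two quantities match, which is what parts (b) and (c) are probing.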

I know that mine must be wrong because it doesn't help me with the next two problems. Please help, thanks!