Classification problem with 0-1 loss

Given a pair $(X, Y)$ with $Y \in \{0, 1\}$, consider classification under 0-1 loss.

Let $p = P(Y = 1)$, and let $R^*$ denote the Bayes risk.

a) Show that $R^* \le \min(p, 1-p)$.

b) Show that equality holds above if X and Y are independent.

c) Exhibit a joint distribution where X and Y are not independent but $R^* = \min(p, 1-p)$.

Solution so far:

a) Suppose that $h^*$ is the Bayes learner; then:

I know that mine must be wrong because it doesn't help me with the next two parts. Please help, thanks!

Re: Classification problem with 0-1 loss

Hey tttcomrader.

Recall that $p \ge 0$, so the minimum of $(p, 1-p)$ will always be at most $1 - p$.

To cover all the possibilities: if $p < 1 - p$, then $p < 0.5$; if instead $1 - p < p$, then $p > 0.5$.

So if $1 - p$ is not the minimum, then $p$ is the minimum, and by definition it is never larger than $1 - p$, so the inequality still holds.
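To complement the hint, here's a quick numerical sketch. It assumes the standard fact that under 0-1 loss the Bayes classifier predicts the more probable label at each $x$, so for a finite joint pmf $R^* = \sum_x \min(P(X=x, Y=0),\, P(X=x, Y=1))$. The specific distributions below are made up for illustration.

```python
def bayes_risk(joint):
    """Bayes risk under 0-1 loss for a finite joint pmf.

    joint[x] = (P(X=x, Y=0), P(X=x, Y=1)).
    The Bayes classifier predicts the more probable label at each x,
    so its error contribution at x is min(P(X=x, Y=0), P(X=x, Y=1)).
    """
    return sum(min(p0, p1) for p0, p1 in joint.values())

def p_y1(joint):
    """Marginal P(Y = 1)."""
    return sum(p1 for _, p1 in joint.values())

# Part b) style: X independent of Y, so P(Y=1 | X=x) = p for every x,
# and the Bayes risk equals min(p, 1-p) exactly.
p = 0.3
indep = {0: (0.5 * (1 - p), 0.5 * p),
         1: (0.5 * (1 - p), 0.5 * p)}
print(bayes_risk(indep), min(p, 1 - p))  # equal

# Part c) style: X and Y dependent, with P(Y=1 | X=0) = 0.2 and
# P(Y=1 | X=1) = 0.4 -- both below 1/2 -- so the Bayes classifier
# predicts 0 everywhere and R* = P(Y=1) = min(p, 1-p) again.
dep = {0: (0.5 * 0.8, 0.5 * 0.2),
       1: (0.5 * 0.6, 0.5 * 0.4)}
p_dep = p_y1(dep)
print(bayes_risk(dep), min(p_dep, 1 - p_dep))  # equal, yet X and Y are dependent
```

The dependent example works because $P(Y=1 \mid X=x)$ stays on the same side of $1/2$ for every $x$: knowing $X$ changes the posterior but never changes the Bayes decision, which is the kind of distribution part c) is asking for.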