What is your definition of conditional expectation? Intuitively, the conditional expectation of $X$ given $Y$ is the function of $Y$ which is "closest" to $X$. In the case of $Y = X$, $X$ is itself a function of $X$, and it is of course the "closest" to $X$, so that $\mathbb{E}[X \mid X] = X$. The formal proof shouldn't be difficult, whatever definition you have; give us the definition you have if you want a more precise solution. (In fact, there's only one definition, but it may well be that you know a definition relative to a particular case; that's why I don't give the general answer yet.)
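A sketch of the formal argument, assuming the standard measure-theoretic definition (conditional expectation as the a.s.-unique $\sigma(X)$-measurable integrable random variable satisfying the partial-averaging property):

```latex
% E[X | X] = X: check the two defining properties of E[X | sigma(X)].
\begin{align*}
  &\text{(i) } X \text{ is } \sigma(X)\text{-measurable (it is the identity map applied to } X\text{);}\\
  &\text{(ii) partial averaging: for every } A \in \sigma(X),\quad
     \int_A X \,\mathrm{d}P = \int_A X \,\mathrm{d}P \text{ trivially.}\\
  &\text{By a.s.\ uniqueness of the conditional expectation, } \mathbb{E}[X \mid X] = X \text{ a.s.}
\end{align*}
```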

The usual integration by parts works if you can differentiate the distribution function $F(t) = P(X \le t)$, which is only possible if the distribution of $X$ is continuous (in fact, has a density).
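For concreteness, here is how that integration-by-parts computation goes (a sketch, assuming $X \ge 0$ with density $f = F'$ and $\mathbb{E}[X] < \infty$):

```latex
\begin{align*}
\mathbb{E}[X] &= \int_0^{\infty} t\, f(t)\, dt
              = \Bigl[\, -t\,\bigl(1 - F(t)\bigr) \Bigr]_0^{\infty}
                + \int_0^{\infty} \bigl(1 - F(t)\bigr)\, dt \\
              &= \int_0^{\infty} P(X > t)\, dt,
\end{align*}
```

the boundary term vanishing because $t\,P(X > t) \to 0$ as $t \to \infty$ when $\mathbb{E}[X] < \infty$.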

In fact there's a more general integration by parts formula for functions with bounded variation, but you probably don't know it.

Then for the general case, you can proceed as follows: for $X \ge 0$, write
$$\mathbb{E}[X] = \mathbb{E}\left[\int_0^{\infty} \mathbf{1}_{\{X > t\}}\, dt\right]$$
(this is an indicator function in the expectation: it equals $1$ on the event in the subscript, and $0$ otherwise), and then by Fubini
$$\mathbb{E}[X] = \int_0^{\infty} \mathbb{E}\big[\mathbf{1}_{\{X > t\}}\big]\, dt = \int_0^{\infty} P(X > t)\, dt$$
(we integrate $1$ when $t < X$ and $0$ otherwise).
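A quick numerical sanity check of the identity $\mathbb{E}[X] = \int_0^\infty P(X > t)\,dt$ (a sketch using NumPy; the exponential distribution and the grid bound $40$ are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.exponential(scale=2.0, size=100_000))  # E[X] = 2

lhs = x.mean()  # plain Monte Carlo estimate of E[X]

# Empirical survival function P(X > t) on a grid, then a trapezoid rule;
# the tail beyond t = 40 is negligible for this distribution.
t = np.linspace(0.0, 40.0, 4001)
survival = 1.0 - np.searchsorted(x, t, side="right") / x.size
rhs = float(np.sum((survival[:-1] + survival[1:]) * np.diff(t)) / 2.0)
print(lhs, rhs)  # both close to 2
```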

Remark: if $X$ is discrete, then $t \mapsto P(X > t)$ has "steps", so that the integral can be rewritten as a sum (for integer-valued $X \ge 0$, $\mathbb{E}[X] = \sum_{n \ge 0} P(X > n)$).
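A small check that the sum of tail probabilities recovers the mean for an integer-valued variable (a sketch; the Poisson distribution is an arbitrary choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.poisson(lam=3.0, size=100_000)  # nonnegative integer-valued, E[X] = 3

lhs = x.mean()
# The survival function is constant on each [n, n+1), so the layer-cake
# integral collapses to a sum of tail probabilities.
rhs = sum((x > n).mean() for n in range(int(x.max()) + 1))
print(lhs, rhs)  # the two agree up to float rounding
```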

Other remark: you wondered if $n\,P(X > n) \to 0$ when $n \to \infty$. This is true if $\mathbb{E}[X] < \infty$. Indeed, we have
$$n\,P(X > n) \le \mathbb{E}\big[X\,\mathbf{1}_{\{X > n\}}\big]$$
(this may be called Markov's inequality, slightly refined), and by the dominated convergence theorem (the integrand is dominated by the integrable $X$) the right-hand side converges to $0$.
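A numerical illustration of the refined Markov bound $n\,P(X > n) \le \mathbb{E}[X\,\mathbf{1}_{\{X > n\}}]$ and of both sides tending to $0$ (a sketch; the exponential distribution and the cutoff levels are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(scale=1.0, size=500_000)  # E[X] = 1

levels = [1, 2, 4, 8]
markov = [n * (x > n).mean() for n in levels]       # n * P(X > n)
truncated = [(x * (x > n)).mean() for n in levels]  # E[X * 1_{X > n}]
for n, m, tr in zip(levels, markov, truncated):
    # the bound holds sample-by-sample: n * 1_{x>n} <= x * 1_{x>n}
    print(f"n={n}: n*P(X>n)={m:.4f} <= E[X 1(X>n)]={tr:.4f}")
```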