Okay, the first part is just Chebyshev's inequality.

P(|Y/n-p|>epsilon)=P(|Y-np|>n epsilon)

Using Chebyshev's inequality, we have

<= V(Y)/(n^2 epsilon^2)

= (npq)/(n^2 epsilon^2)

= (pq)/(n epsilon^2)

which goes to zero as n->infinity
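A quick numerical check of the bound above (not part of the proof; the function name and constants are mine): for Y ~ Binomial(n, p), estimate P(|Y/n - p| > epsilon) by simulation and compare it with the Chebyshev bound pq/(n epsilon^2).

```python
import random

# Simulate the tail probability P(|Y/n - p| > eps) for Y ~ Binomial(n, p)
# and compare it with the Chebyshev bound pq/(n eps^2) derived above.
def tail_prob(n, p=0.3, eps=0.05, trials=2000, seed=0):
    rng = random.Random(seed)
    exceed = 0
    for _ in range(trials):
        y = sum(rng.random() < p for _ in range(n))  # one Binomial(n, p) draw
        exceed += abs(y / n - p) > eps
    return exceed / trials

p, eps = 0.3, 0.05
q = 1 - p
for n in (100, 400, 1600):
    print(n, tail_prob(n), p * q / (n * eps ** 2))
```

Both the simulated probability and the bound shrink as n grows, which is exactly the convergence in probability being proved.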

We can also quote a result someone else asked me about last week:

If the statistic is unbiased and its variance goes to zero as n -> infinity, then it is consistent.

That's what I just did here.

--------------------------------------------------------------------------------

I'm going to make dinner.

I'll do parts b,c later.

By the way, this is not convergence in distribution; it is a stronger mode of convergence: convergence in probability.

Convergence in distribution is where a sequence of cdfs converges to a cdf, like in the central limit theorem.
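To illustrate the contrast (a sketch of mine, not part of the problem): under the CLT, the cdf of the standardized binomial (Y - np)/sqrt(npq) approaches the standard normal cdf. The function name and the evaluation point 0.5 are arbitrary choices for the demo.

```python
import math
import random

# Convergence in distribution (the CLT case): estimate the cdf of the
# standardized binomial at x by simulation and compare with the
# standard normal cdf at the same point.
def standardized_cdf_at(x, n, p=0.3, trials=4000, seed=1):
    rng = random.Random(seed)
    q = 1 - p
    hits = 0
    for _ in range(trials):
        y = sum(rng.random() < p for _ in range(n))  # Binomial(n, p) draw
        hits += (y - n * p) / math.sqrt(n * p * q) <= x
    return hits / trials

phi = 0.5 * (1 + math.erf(0.5 / math.sqrt(2)))  # standard normal cdf at 0.5
print(standardized_cdf_at(0.5, 400), phi)
```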

LOL, see

http://www.statisticalengineering.com/convergence.htm

"Convergence in probability" is not quite the same as convergence in distribution.

this is quite useful...

http://en.wikipedia.org/wiki/Converg...ndom_variables

----------------------------------------------------------------------------------

Part b can be done two ways.

It's basically the same as part a: just switch what counts as a success and a failure.

That swaps p and q, and you're done.

But to do it directly: let epsilon>0...

P(|(1-Y/n)-(1-p)|>epsilon)

=P(|-Y/n+p|>epsilon)

=P(|Y/n-p|>epsilon) since |-1|=1

=P(|Y-np|>n epsilon)...

The rest is as before.
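A sanity check of the algebra above (my own snippet, not part of the argument): |(1 - Y/n) - (1 - p)| equals |Y/n - p| exactly, so the two tail events are literally the same event.

```python
import random

# Check the identity |(1 - Y/n) - (1 - p)| = |Y/n - p| on a few
# simulated Binomial(n, p) draws (equal up to float rounding).
rng = random.Random(2)
n, p = 50, 0.3
for _ in range(5):
    y = sum(rng.random() < p for _ in range(n))
    lhs = abs((1 - y / n) - (1 - p))
    rhs = abs(y / n - p)
    assert abs(lhs - rhs) < 1e-12
print("identity holds")
```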

-------------------------------------------------------------------------------

Part c follows from...

If x_n->a and y_n->b (both in probability),

then (x_n)(y_n) ->ab, in probability.

Instead, I'll prove your problem directly via the triangle inequality.

Let epsilon>0...

P(|(Y/n)(1-Y/n)-p(1-p)|>epsilon)

=P(|(Y/n)-(Y/n)^2-p+p^2|>epsilon)

=P(|[(Y/n)-p] + [p^2-(Y/n)^2]|>epsilon), and now the triangle inequality gives

<=P(|(Y/n)-p|>epsilon/2)+P(|(Y/n)^2-p^2|>epsilon/2).

If the sum exceeds epsilon in absolute value, then at least one of the two bracketed terms must exceed epsilon/2 in absolute value, and the union bound gives the line above.

Now we've already shown the first term goes to zero.

The second term goes to zero because Y/n -> p in probability implies (Y/n)^2 -> p^2 in probability.

In general, if x_n -> a in probability and g(.) is any continuous function, then g(x_n) -> g(a) in probability (the continuous mapping theorem).

That can be found at...

http://planetmath.org/encyclopedia/C...ormations.html
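A numerical illustration of part c (my own sketch, with made-up constants, not part of the proof): the plug-in estimate (Y/n)(1 - Y/n) should land near p(1 - p) more and more often as n grows.

```python
import random

# Estimate P(|(Y/n)(1 - Y/n) - p(1 - p)| > eps) by simulation for
# growing n; consistency says this tail probability should shrink.
def plugin_var_tail(n, p=0.3, eps=0.02, trials=2000, seed=3):
    rng = random.Random(seed)
    target = p * (1 - p)
    exceed = 0
    for _ in range(trials):
        y = sum(rng.random() < p for _ in range(n))  # Binomial(n, p) draw
        phat = y / n
        exceed += abs(phat * (1 - phat) - target) > eps
    return exceed / trials

for n in (100, 400, 1600):
    print(n, plugin_var_tail(n))
```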