# Thread: Looking for a better method (convergence in mean square)

1. ## Looking for a better method (convergence in mean square)

Hi,

I'm preparing a lesson with a student, and since he isn't a maths major, I'm looking for a simpler solution than mine, because mine looks too "mathematical".

We have an iid sample $X_1,\dots,X_n$ where each $X_i$ follows a uniform distribution over $[-\theta,\theta]$, $\theta>0$ being a parameter, and we set $M=\max(|X_1|,\dots,|X_n|)$.

Earlier questions asked for the mean and the variance of $X_i$, and for the cdf of $|X_i|$, which is $y/\theta$ for $y\in[0,\theta]$.

Question 3) asks for the cdf of M, which is $G(u)=\frac{u^n}{\theta^n}$, for $u\in[0,\theta]$
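For completeness, this cdf follows from independence and the cdf of $|X_i|$ found earlier (a sketch of the standard argument):

```latex
G(u) = \mathbb{P}(M \le u)
     = \mathbb{P}(|X_1| \le u, \dots, |X_n| \le u)
     = \prod_{i=1}^{n} \mathbb{P}(|X_i| \le u)
     = \left(\frac{u}{\theta}\right)^n ,
\qquad u \in [0,\theta].
```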

It's ok from here.

Then, question 4) asks for the mean of M.
What I did was take the derivative of $G$ and compute $\mathbb{E}(M)=\int_0^\theta u\, G'(u) \,du=\dots=\frac{n}{n+1}\cdot \theta$
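Spelling out the elided steps, with $G'(u)=nu^{n-1}/\theta^n$ on $[0,\theta]$:

```latex
\mathbb{E}(M) = \int_0^\theta u \, G'(u)\, du
             = \int_0^\theta u \cdot \frac{n\,u^{n-1}}{\theta^n}\, du
             = \frac{n}{\theta^n} \cdot \frac{\theta^{n+1}}{n+1}
             = \frac{n}{n+1}\,\theta .
```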

I'm already concerned that this method is too complicated... So if you have a better one, call it (1).

Second part of question 4) asks for $k$ such that $W=kM$ is an unbiased estimator of $\theta$.
Nothing magic here: $k=\frac{n+1}{n}$.
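As a quick sanity check (not part of the exercise), a short Monte Carlo simulation with illustrative values of $\theta$ and $n$ suggests $W=\frac{n+1}{n}M$ is indeed unbiased:

```python
import random

# Hypothetical sanity check: estimate E(W) for W = (n+1)/n * max|X_i|,
# with X_i ~ Uniform(-theta, theta); theta and n are illustrative choices.
random.seed(0)
theta, n, trials = 2.0, 5, 100_000

total = 0.0
for _ in range(trials):
    m = max(abs(random.uniform(-theta, theta)) for _ in range(n))
    total += (n + 1) / n * m  # W = k * M with k = (n+1)/n

w_bar = total / trials
print(w_bar)  # sample mean of W, should be close to theta
```

With enough trials the sample mean of $W$ settles near $\theta$, while the plain maximum $M$ alone would sit visibly below it.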

Third part of question 4) asks us to show that $W$ converges in mean square.
It is not mentioned that it converges to $\theta$. So how can I explain to the student why it should converge to $\theta$?
This is the most awful part, because what I did is use the property that:
$W$ converges in mean square to a constant $c$ $\Longleftrightarrow$ $\lim_{n\to\infty}\mathbb{E}(W)=c$ and $\lim_{n\to\infty} \mathbb{V}\text{ar}(W)=0$ (here $c=\theta$, since $W$ is unbiased).
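That equivalence comes from the bias–variance decomposition of the mean-square error, which may be the easiest thing to show the student directly:

```latex
\mathbb{E}|W - c|^2
  = \mathbb{E}\big[(W - \mathbb{E}(W)) + (\mathbb{E}(W) - c)\big]^2
  = \mathbb{V}\text{ar}(W) + \big(\mathbb{E}(W) - c\big)^2 ,
```

since the cross term vanishes; both summands tend to $0$ exactly when $\mathbb{E}(W)\to c$ and $\mathbb{V}\text{ar}(W)\to 0$.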

But calculating the variance of $W$ is really ugly... So if you have a better solution, please call it (2).

And if you think these are the only ways to solve the questions, just eat a T-bone steak for dinner

Thanks

2. I'm not exactly sure what you're asking, and I'm almost certain you'll yell at me.

BUT I would ignore Chebyshev's inequality and just show that $W_n\to\theta$ in mean square as $n\to\infty$.

That is, just use the definition: show $E|W_n-\theta|^2\to 0$.

And you do need to calculate the moments, but instead of the variance I would just use

$E|W_n-\theta|^2=E(W^2_n)-2\theta E(W_n)+\theta^2$

and show that this approaches zero as n goes to infinity.
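Carrying this out with the moments of $M$ (a sketch, using $G'(u)=nu^{n-1}/\theta^n$ and $W_n=\frac{n+1}{n}M$):

```latex
\mathbb{E}(M^2) = \int_0^\theta u^2 \cdot \frac{n\,u^{n-1}}{\theta^n}\, du
              = \frac{n}{n+2}\,\theta^2 ,
\qquad
\mathbb{E}(W_n^2) = \frac{(n+1)^2}{n^2}\,\mathbb{E}(M^2)
                = \frac{(n+1)^2}{n(n+2)}\,\theta^2 ,
```

so that, with $\mathbb{E}(W_n)=\theta$,

```latex
\mathbb{E}|W_n - \theta|^2
  = \mathbb{E}(W_n^2) - 2\theta\,\mathbb{E}(W_n) + \theta^2
  = \frac{(n+1)^2}{n(n+2)}\,\theta^2 - \theta^2
  = \frac{\theta^2}{n(n+2)} \;\longrightarrow\; 0 .
```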

3. Thanks, working with $E|W-\theta|^2$ directly does simplify things a bit.
And no, I won't yell.