# Maximum likelihood estimate

• Jan 7th 2012, 05:04 AM
FRMST
Maximum likelihood estimate
Can you help me with this problem?

Let $\displaystyle {X_1},{X_2},...,{X_n}$ be a simple random sample, where $\displaystyle {X_i}$ is a discrete uniform random variable on $\displaystyle \{ 1,2,3,\ldots,\theta \}$.

Thus, $\displaystyle P({X_i} = x) = \frac{1}{\theta }$ for $\displaystyle x = 1, 2, \ldots, \theta$.

Find the maximum likelihood estimate for $\displaystyle \theta$, and show that it is biased.

--------------

I managed to conclude that the maximum likelihood estimate is:
$\displaystyle {\tilde \theta _{ML}} = \max \{ {X_1},{X_2},...,{X_n}\}$

But I don't know how to show it's biased: since it's a discrete random variable, I don't think I can differentiate.
• Jan 7th 2012, 06:31 AM
SpringFan25
Re: Maximum likelihood estimate
Edit: I made a mistake the first time I tried this!

Assuming your derived estimator is correct (I haven't checked), it is sufficient to prove:

$\displaystyle E\left(Max(X_1,X_2,X_3....,X_n) \right) < \theta$

it is straightforward to show that
$\displaystyle P\left(\hat{\theta} > \theta\right) =0$
$\displaystyle P\left(\hat{\theta} < \theta\right) >0$

what can you deduce from this about $\displaystyle E(\hat{\theta} - \theta)$?

hint in spoiler
Spoiler:

write
$\displaystyle E(\hat{\theta} - \theta) = P(\hat{\theta} = \theta)\times 0 + P(\hat{\theta} > \theta) \times ( \text{positive term} ) + P(\hat{\theta} < \theta)\times ( \text{negative term} )$
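The claim that $E(\max(X_1,\ldots,X_n)) < \theta$ is easy to check numerically. A minimal Monte Carlo sketch (the values $\theta = 5$ and $n = 3$ are illustration choices, not from the problem):

```python
# Monte Carlo sketch: draw many samples of size n from the discrete uniform
# on {1, ..., theta} and average the sample maximum. The average settles
# noticeably below theta, illustrating the bias of theta_hat = max(X_i).
import random

random.seed(0)
theta, n, reps = 5, 3, 100_000

mean_max = sum(
    max(random.randint(1, theta) for _ in range(n))
    for _ in range(reps)
) / reps

print(mean_max)   # close to 4.2, i.e. strictly below theta = 5
```

The exact expectation for these illustration values works out to 4.2 (see the later posts for how to compute it), so the simulated average should land near there.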
• Jan 7th 2012, 07:00 AM
FRMST
Re: Maximum likelihood estimate
Thanks!
I understand your help; I just have one doubt:

Is it true that
$\displaystyle E(X) \overset{?}{=} 0 \cdot P(X = 0) + \alpha \cdot P(X > 0) + \beta \cdot P(X < 0)$
for some $\alpha > 0$ and $\beta < 0$, where $X$ takes values on all the reals?

I'm not questioning whether your response is correct; I just don't know whether that identity holds.

Is there a way to calculate $\displaystyle E(\max \{ {X_1},...,{X_n}\} )$?
• Jan 7th 2012, 10:46 AM
SpringFan25
Re: Maximum likelihood estimate
Quote:

Originally Posted by FRMST
Thanks!
I understand your help; I just have one doubt:

Is it true that
$\displaystyle E(X) \overset{?}{=} 0 \cdot P(X = 0) + \alpha \cdot P(X > 0) + \beta \cdot P(X < 0)$
for some $\alpha > 0$ and $\beta < 0$?

Yes, this is true. It follows from the definition of expectation. The derivation below is for integer-valued variables, but it generalises to continuous variables too:
$\displaystyle E(X) = \sum^{\infty}_{i=-\infty}P(X=i)i$

$\displaystyle = \left( \sum^{-1}_{i=-\infty}P(X=i)i \right) +\left( P(X=0)\times 0 \right) +\left(\sum^{\infty}_{i=1}P(X=i)i \right)$

Quote:

Is there a way to calculate: $\displaystyle E(\max \{ {X_1},...,{X_n}\} )$
This should be possible, but it really isn't necessary to solve the problem you posted.

If you really want to evaluate the expectation, you could start by noting that $\displaystyle P(\hat{\theta} \leq k) = P(X_1 \leq k)P(X_2 \leq k)\cdots P(X_n \leq k) = \left(\frac{k}{\theta} \right)^n$ for $k = 1, \ldots, \theta$.

This gives you the cumulative distribution function, which might lead to an expectation if you manipulate it.
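One way to carry out that manipulation: difference the CDF to get the pmf, then sum directly. A short sketch (the values $\theta = 5$ and $n = 3$ are illustration choices of mine, not from the thread):

```python
# Recover the pmf of theta_hat = max(X_1, ..., X_n) from its CDF
# P(theta_hat <= k) = (k/theta)^n, then sum k * pmf(k) for the exact
# expectation. No differentiation is needed in the discrete case.
theta, n = 5, 3

def cdf(k: int) -> float:
    return (k / theta) ** n

# pmf by differencing the CDF: P(theta_hat = k) = CDF(k) - CDF(k - 1)
pmf = {k: cdf(k) - cdf(k - 1) for k in range(1, theta + 1)}

expectation = sum(k * p for k, p in pmf.items())
print(expectation)   # about 4.2 for theta = 5, n = 3: strictly below theta
```

Since the expectation comes out strictly below $\theta$, this also confirms the bias directly for these values.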
• Jan 7th 2012, 02:17 PM
FRMST
Re: Maximum likelihood estimate
You are right. Now I see why that's true... pretty clever

Quote:

This gives you a probability function, which might lead to an expectation if you manipulate it.
Actually, I also got that cumulative expression, but I don't know how to manipulate it, since it's a discrete random variable.
I can't just differentiate with respect to k. If it were continuous, the problem would be easy, but I'm stuck because it's discrete.
Could you give me an idea for manipulating it?

Thank you so much.
• Jan 8th 2012, 05:28 AM
SpringFan25
Re: Maximum likelihood estimate
You can use this property:

$\displaystyle E(X) = \sum_{i=1}^{\infty}P(X \geq i)$

(the above property is only valid for random variables which take non-negative integer values).

OR
note that for integer-valued RVs: $\displaystyle P(X=k) = P(X \leq k) - P(X \leq k-1)$
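The tail-sum property makes the computation a one-liner, since $P(\hat{\theta} \geq i) = 1 - P(\hat{\theta} \leq i-1) = 1 - ((i-1)/\theta)^n$. A sketch (again with the assumed illustration values $\theta = 5$, $n = 3$):

```python
# Tail-sum identity E(X) = sum_{i>=1} P(X >= i), valid for non-negative
# integer-valued X, applied to theta_hat = max of n discrete-uniform draws.
# Here P(theta_hat >= i) = 1 - ((i - 1)/theta)^n, and the sum stops at
# i = theta because P(theta_hat >= i) = 0 beyond that.
theta, n = 5, 3

expectation = sum(1 - ((i - 1) / theta) ** n for i in range(1, theta + 1))
print(expectation)   # about 4.2, matching the pmf-based computation
```

Both routes (differencing the CDF, or the tail sum) give the same value, which is strictly below $\theta$, as the bias argument predicted.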