## Find the MLE

1. Hi, I'm trying to solve the following problem:

Let $Y_1, Y_2, \ldots, Y_n$ denote a random sample from the probability density function
$f(y|\theta)=(\theta + 1)y^{\theta}$, for $0 < y < 1,\ \theta > -1$,
and 0 elsewhere.

Find the MLE for $\theta$. Compare the answer to the method-of-moments answer: $[2\bar{Y} - 1]/[1 - \bar{Y}]$.

Sorry for that last part. Y bar means Y with the line on top.

I tried to do this by taking the ln of each term after multiplying the $f(y_i)$'s, taking the derivative with respect to $\theta$, and equating to zero, but I end up with $-(\ln y + 1)$. Is this possible? I figure it's not, since it asks to compare to the M.O.M. answer, which seems totally different...

Any ideas?

2. I put that on my exam last week. You need the likelihood function, which is

$L=\prod_{i=1}^n f(y_i|\theta) = (\theta +1)^n \left(\prod_{i=1}^n y_i \right)^{\theta}$

you can differentiate this or take the log and differentiate that.

$\ln L=n\ln (\theta +1) +\theta \sum_{i=1}^n \ln y_i$

NOW differentiate and set equal to zero.
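Not part of the original thread, but the differentiate-and-solve step can be sanity-checked symbolically. A minimal sketch with sympy, where `S` is my shorthand for $\sum_{i=1}^n \ln y_i$ (negative, since each $0 < y_i < 1$):

```python
import sympy as sp

# S stands in for sum(ln y_i); it is negative since each 0 < y_i < 1.
theta = sp.symbols('theta')
n = sp.symbols('n', positive=True)
S = sp.symbols('S', negative=True)

lnL = n * sp.log(theta + 1) + theta * S    # log-likelihood from above
score = sp.diff(lnL, theta)                # n/(theta + 1) + S
mle = sp.solve(sp.Eq(score, 0), theta)[0]  # equivalent to -n/S - 1
print(sp.simplify(mle))
```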

3. "I tried to do this by taking the ln of each term after multiplying the $f(y_i)$'s, taking the derivative with respect to $\theta$, and equating to zero"

That's what I was describing... It seems to have given me a fishy answer.

Thanks for the quick response.

4. What I did next was
$(\theta + 1)^n y^{\theta n}$
$n\ln(\theta + 1) + \theta n \ln y$

5. NO, you have n different y's, reread what I wrote.

6. Ohhhh right... Forgot about that log rule... I know once I take the derivative I get:
$\frac{n}{\theta +1} + \sum_{i=1}^n \ln y_i$

and that I'm supposed to get $\bar{Y}$ from the summation on the right side, but how can I get rid of that ln? If I take the exp of both sides I'll end up with $\theta$ as an exponent... If I put $n \ln y_i$ and then $y_i^n$ I won't get $\bar{Y}$ like I want... and it just occurred to me that $y_i^n$ doesn't even make sense..

7. Anybody know how I can solve for $\theta$ after this step? I can't think of how to get rid of the $\ln y_i$

8. "Anybody know how I can solve for $\theta$ after this step? I can't think of how to get rid of the $\ln y_i$"

You don't 'get rid of' it.

Do you know how to solve $0 = \frac{n}{\theta + 1} + B$ for $\theta ?$

You can't get it in terms of $\overline{y}$.

9. Sure, I'll end up with:
$\theta = \frac{-n}{\sum_{i=1}^n \ln y_i} - 1$
$\theta = \frac{n}{\sum_{i=1}^n 1/y_i} - 1$

but I don't see any relation to the M.O.M. result. I was expecting it to be the same or similar...

10. Sure, I'll end up with:
$\theta = \frac{-n}{\sum_{i=1}^n \ln y_i} - 1$
$\theta = \frac{n}{\sum_{i=1}^n 1/y_i} - 1$ Mr F says: Where has your log gone?
$\theta = n\sum_{i=1}^n y_i - 1$ Mr F says: This is not a valid algebraic manipulation.
$\theta = \bar{Y} - 1$

but I don't see any relation to the M.O.M. result. I was expecting it to be the same or similar...

11. Yeah, sorry about the algebra mistake...
I think this is legit:
$\frac{n}{\theta+1} = -\sum_{i=1}^n \ln y_i$
$\theta+1 = \frac{-n}{\sum_{i=1}^n \ln y_i}$
$\theta = \frac{-n}{\sum_{i=1}^n \ln y_i} - 1$

But I still don't see any connection to the M.O.M. result...
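As a sanity check on the two formulas, here is a minimal simulation sketch (my own addition, not from the thread): it draws a sample from $f(y|\theta)=(\theta + 1)y^{\theta}$ by inverse-CDF sampling and computes both estimators. They come out close to the true $\theta$ but not equal to each other.

```python
import math
import random

# Sample from f(y|theta) = (theta + 1) * y**theta on (0, 1) via the
# inverse CDF: F(y) = y**(theta + 1)  =>  Y = U**(1 / (theta + 1)).
random.seed(1)
true_theta = 2.0
n = 10_000
ys = [random.random() ** (1.0 / (true_theta + 1.0)) for _ in range(n)]

ybar = sum(ys) / n
sum_log = sum(math.log(y) for y in ys)

theta_mle = -n / sum_log - 1.0                  # MLE derived in this thread
theta_mom = (2.0 * ybar - 1.0) / (1.0 - ybar)   # M.O.M. answer from the problem

print(theta_mle, theta_mom)  # both should land near 2, but they differ
```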

12. The MLE and the MOM are NOT the same.
Here we have newly transformed y's, $W = \ln Y$.
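For completeness (my addition; these are standard results): the M.O.M. estimator comes from matching the first moment,

$E[Y] = \int_0^1 y\,(\theta + 1)y^{\theta}\,dy = \frac{\theta+1}{\theta+2}$

Setting $E[Y] = \bar{Y}$ and solving gives $\hat{\theta}_{MOM} = \frac{2\bar{Y} - 1}{1 - \bar{Y}}$. The MLE instead works through the transformed sample: $W = -\ln Y$ is exponential with rate $\theta + 1$, so $\hat{\theta}_{MLE} = \frac{1}{\overline{W}} - 1$ with $\overline{W} = -\frac{1}{n}\sum_{i=1}^n \ln y_i$. Both estimators are consistent for $\theta$, but they are different functions of the data, so on any finite sample they need not agree.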