# minimal suff stats, MLE question (struggling)


• Feb 15th 2011, 07:47 PM
Volga
minimal suff stats, MLE question (struggling)
Question.

Let $Y_1,...Y_n$ be a random sample from the density function

$f_Y(y;\theta,a)=\left\{\begin{array}{cc}\theta a^{\theta}y^{-(\theta+1)}, & y{\geq}a,\\ 0 & \mbox{otherwise}\end{array}\right.$

where both $a$ and $\theta$ are positive unknown parameters.

i. Find a minimum sufficient statistic for $(a,\theta)$.

ii. Find the maximum likelihood estimator for $a$.

iii. Find the maximum likelihood estimator for $\theta$.

My attempt at the answer.

i. Joint density for a sample $Y_1,\ldots,Y_n$ (the product of the individual indicators, $\prod_{i=1}^n I_{[a,\infty)}(y_i)$, collapses to a single indicator on the sample minimum $y_{(1)}$):

$f_Y(y;\theta,a)=\theta^n a^{n\theta}\left(\prod_{i=1}^{n}y_i\right)^{-(\theta+1)}I_{[a,\infty)}(y_{(1)})$

Now I need to look at the ratio of two densities,

$\frac{f_Y(y;\theta,a)}{f_X(x;\theta,a)}=\frac{\theta^n a^{n\theta}\left(\prod_{i=1}^{n}y_i\right)^{-(\theta+1)}I_{[a,\infty)}(y_{(1)})}{\theta^n a^{n\theta}\left(\prod_{i=1}^{n}x_i\right)^{-(\theta+1)}I_{[a,\infty)}(x_{(1)})}=\left(\frac{\prod_{i=1}^{n}y_i}{\prod_{i=1}^{n}x_i}\right)^{-(\theta+1)}\frac{I_{[a,\infty)}(y_{(1)})}{I_{[a,\infty)}(x_{(1)})}$

which will be constant as a function of $(\theta,a)$ if and only if $\frac{\prod_{i=1}^n{y_i}}{\prod_{i=1}^n{x_i}}=1$ and $\frac{I_{[a,\infty)}(y_{(1)})}{I_{[a,\infty)}(x_{(1)})}=1$ for every $a>0$,

or $\prod_{i=1}^n{y_i}=\prod_{i=1}^n{x_i}$ and $y_{(1)}=x_{(1)}$ (equality of the indicators at every $a$ forces equality of the minima).

Therefore, $\left(Y_{(1)},\prod_{i=1}^n{Y_i}\right)$ is a minimal sufficient statistic for $(a,\theta)$.
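A quick numerical sanity check of the sufficiency claim (a Python sketch; the toy samples and the grid of parameter values are made-up choices of mine): two samples sharing the same minimum and the same product should have identical likelihoods at every $(\theta,a)$.

```python
import math

def lik(sample, theta, a):
    # Joint density: theta^n * a^(n*theta) * (prod y_i)^-(theta+1), if min >= a
    if min(sample) < a:
        return 0.0
    n = len(sample)
    prod = math.prod(sample)
    return theta**n * a**(n * theta) * prod ** (-(theta + 1))

ys = [2.0, 3.0, 4.0]   # product 24, minimum 2
xs = [2.0, 2.0, 6.0]   # a different sample with the same product and minimum

# Likelihoods agree at every parameter point, as sufficiency of
# (Y_(1), prod Y_i) predicts.
for theta in (0.5, 1.0, 2.5):
    for a in (0.5, 1.0, 1.9):
        assert math.isclose(lik(ys, theta, a), lik(xs, theta, a))
```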

ii and iii. Finding MLE for $a,\theta$

Use the likelihood function derived in (i)

$L_Y(\theta,a;y)=\theta^n a^{n\theta}\left(\prod_{i=1}^{n}y_i\right)^{-(\theta+1)}I_{[a,\infty)}(y_{(1)})$

$l_Y(\theta,a;y)=n\ln\theta+n\theta\ln a-(\theta+1)\sum_{i=1}^n\ln y_i,\quad y_{(1)}{\geq}a$

MLE for $a$.

I note that, viewing the likelihood $L_Y(\theta,a;y)=\theta^n a^{n\theta}\left(\prod_{i=1}^{n}y_i\right)^{-(\theta+1)}I_{[a,\infty)}(y_{(1)})$ as a function of $a$ alone (with $n,\theta>0$ and the data held fixed), the only factor involving $a$ is $a^{n\theta}$, which is strictly increasing, so the likelihood attains no interior maximum. The indicator restricts $a$ to $a\leq y_{(1)}$, so the likelihood is maximised at the boundary, $a=y_{(1)}$, the minimum order statistic of the sample.

Therefore, can I say that MLE for $a$ is $Y_{(1)}$?
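The monotonicity argument can also be checked numerically (a small Python sketch; the sample values and the fixed $\theta$ are arbitrary choices of mine): the log-likelihood should increase in $a$ all the way up to $y_{(1)}$ and drop to $-\infty$ beyond it.

```python
import math

ys = [2.3, 3.1, 2.7, 5.0]   # toy sample, made up for illustration
theta = 2.0                 # any fixed positive theta
y_min = min(ys)
n = len(ys)

def log_lik(a):
    # log-likelihood as a function of a; -inf once a exceeds the sample minimum
    if a > y_min:
        return float("-inf")
    return (n * math.log(theta) + n * theta * math.log(a)
            - (theta + 1) * sum(math.log(y) for y in ys))

grid = [y_min * k / 10 for k in range(1, 10)] + [y_min]   # a values up to y_(1)
vals = [log_lik(a) for a in grid]
assert vals == sorted(vals)          # increasing in a on (0, y_(1)]
assert max(vals) == log_lik(y_min)   # maximised at the boundary a = Y_(1)
```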

MLE for $\theta$.

I will maximise the log-likelihood function as follows.

Set the first derivative to zero:

$\frac{\partial}{\partial\theta}\,l_Y(\theta,a;y)=\frac{n}{\theta}+n\ln a-\sum_{i=1}^n\ln y_i=0,\quad y_{(1)}{\geq}a.$

MLE $\hat{\theta}=\frac{n}{\sum_{i=1}^n\ln Y_i-n\ln\hat{a}}$, where $\hat{a}=Y_{(1)}$ from part (ii).

The second-order derivative is negative, $-\frac{n}{\theta^2}<0$, therefore this is a maximum.
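Putting (ii) and (iii) together, both estimators can be sanity-checked by simulation (a Python sketch; the true parameter values and sample size are arbitrary choices of mine). Sampling uses the inverse-CDF method: the CDF is $1-(a/y)^\theta$ for $y\geq a$, so $aU^{-1/\theta}$ with $U\sim U(0,1)$ has the density above.

```python
import math
import random

random.seed(0)

# True parameters and sample size (arbitrary choices for this check)
a_true, theta_true, n = 2.0, 3.0, 100_000

# Inverse-CDF sampling: a * U**(-1/theta) has the target Pareto-type density
ys = [a_true * random.random() ** (-1.0 / theta_true) for _ in range(n)]

# MLEs derived above
a_hat = min(ys)   # first order statistic Y_(1)
theta_hat = n / (sum(math.log(y) for y in ys) - n * math.log(a_hat))

# Both estimates should sit close to the true values for n this large
assert abs(a_hat - a_true) < 0.01
assert abs(theta_hat - theta_true) < 0.1
```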
• Feb 15th 2011, 10:54 PM
matheagle
why is that sum starting at ZERO for the MLE of theta?

To show that the first order stat is the MLE of a, you need to show that the likelihood function is increasing in a.
So we want a as large as possible, i.e., the first order stat, its upper bound.
• Feb 16th 2011, 01:52 AM
Volga
Quote:

Originally Posted by matheagle
why is that sum starting at ZERO for the MLE of theta?

Hmm... don't know... wondering myself... (Wondering) (corrected!!)

Quote:

Originally Posted by matheagle
To show that the first order stat is the MLE of a, you need to show that the likelihood function is increasing in a.
So we want a as large as possible, i.e., the first order stat, its upper bound.

Got it. Thanks!

Does it mean that it is SOLVED?