Question.

Let $\displaystyle Y_1,\ldots,Y_n$ be a random sample from the density function

$\displaystyle f_Y(y;\theta,a)=\left\{\begin{array}{cc}\theta a^{\theta}y^{-(\theta+1)}, & y\geq a,\\ 0 & \mbox{otherwise}\end{array}\right.$

where both $\displaystyle a$ and $\displaystyle \theta$ are positive unknown parameters.

i. Find a minimum sufficient statistic for $\displaystyle (a,\theta)$.

ii. Find the maximum likelihood estimator for $\displaystyle a$.

iii. Find the maximum likelihood estimator for $\displaystyle \theta$.

My attempt at the answer.

i. The joint density for a sample $\displaystyle Y_1,\ldots,Y_n$ is

$\displaystyle f_Y(y;\theta,a)=\theta^n a^{n\theta}\left(\prod_{i=1}^{n}Y_i\right)^{-(\theta+1)}I_{[a,\infty)}(Y_{(1)}),$

since the product of indicators $\displaystyle \prod_{i=1}^{n}I_{[a,\infty)}(Y_i)$ collapses to $\displaystyle I_{[a,\infty)}(Y_{(1)})$.

Now I need to look at the ratio of the densities of two samples $\displaystyle y$ and $\displaystyle x$:

$\displaystyle \frac{f_Y(y;\theta,a)}{f_X(x;\theta,a)}=\frac{\theta^n a^{n\theta}\left(\prod_{i=1}^{n}Y_i\right)^{-(\theta+1)}I_{[a,\infty)}(Y_{(1)})}{\theta^n a^{n\theta}\left(\prod_{i=1}^{n}X_i\right)^{-(\theta+1)}I_{[a,\infty)}(X_{(1)})}=\frac{\left(\prod_{i=1}^{n}Y_i\right)^{-(\theta+1)}I_{[a,\infty)}(Y_{(1)})}{\left(\prod_{i=1}^{n}X_i\right)^{-(\theta+1)}I_{[a,\infty)}(X_{(1)})},$

which is constant as a function of $\displaystyle (\theta,a)$ if and only if $\displaystyle \frac{\prod_{i=1}^{n}Y_i}{\prod_{i=1}^{n}X_i}=1$ and $\displaystyle \frac{I_{[a,\infty)}(Y_{(1)})}{I_{[a,\infty)}(X_{(1)})}=1$ for every $\displaystyle a>0$,

that is, $\displaystyle \prod_{i=1}^{n}Y_i=\prod_{i=1}^{n}X_i$ and $\displaystyle Y_{(1)}=X_{(1)}$ (the indicators agree at every $\displaystyle a$ exactly when the minima are equal).

Therefore, $\displaystyle \left(Y_{(1)},\prod_{i=1}^{n}Y_i\right)$ is a minimal sufficient statistic for $\displaystyle (a,\theta)$.
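As a quick numerical illustration of this (a hedged sketch, not part of the argument above): two different samples sharing the same minimum and the same product should have identical likelihoods at every $\displaystyle (\theta,a)$. The samples `[2, 3, 4]` and `[2, 2.4, 5]` below are chosen purely for illustration; both have minimum 2 and product 24.

```python
import numpy as np

def loglik(y, theta, a):
    """Log-likelihood of the Pareto(theta, a) sample y derived above."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    if y.min() < a:          # the indicator I_[a,inf)(Y_(1)) is zero
        return -np.inf
    return n * np.log(theta) + n * theta * np.log(a) - (theta + 1) * np.log(y).sum()

x = [2.0, 3.0, 4.0]   # minimum 2, product 24
y = [2.0, 2.4, 5.0]   # same minimum, same product
for theta in (0.5, 1.0, 3.0):
    for a in (0.5, 1.0, 2.0):
        # equal statistics -> equal likelihood, whatever (theta, a) is
        assert abs(loglik(x, theta, a) - loglik(y, theta, a)) < 1e-9
```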

ii and iii. Finding MLE for $\displaystyle a,\theta$

Use the likelihood function derived in (i)

$\displaystyle L_Y(\theta,a;y)=\theta^n a^{n\theta}\left(\prod_{i=1}^{n}Y_i\right)^{-(\theta+1)}I_{[a,\infty)}(Y_{(1)})$

$\displaystyle l_Y(\theta,a;y)=n\ln\theta+n\theta\ln a-(\theta+1)\sum_{i=1}^{n}\ln Y_i,\quad Y_{(1)}\geq a$

MLE for $\displaystyle a$.

I note that the likelihood function $\displaystyle L_Y(\theta,a;y)=\theta^n a^{n\theta}\left(\prod_{i=1}^{n}Y_i\right)^{-(\theta+1)}I_{[a,\infty)}(Y_{(1)})$, viewed as a function of $\displaystyle a$ alone with everything else held fixed and $\displaystyle n,\theta>0$, is strictly increasing and attains no interior maximum. The constraint $\displaystyle a\leq Y_{(1)}$ (otherwise the indicator, and hence the likelihood, is zero) means the likelihood is maximised at the largest admissible value of $\displaystyle a$, namely the minimum order statistic of the sample.

Therefore, can I say that the MLE for $\displaystyle a$ is $\displaystyle \hat{a}=Y_{(1)}$?

MLE for $\displaystyle \theta$.

I will maximise the log-likelihood function as follows.

Set the first derivative to zero:

$\displaystyle \frac{\partial}{\partial\theta}\,l_Y(\theta,a;y)=\frac{n}{\theta}+n\ln a-\sum_{i=1}^{n}\ln Y_i=0.$

Since $\displaystyle a$ is unknown, I substitute its MLE $\displaystyle \hat{a}=Y_{(1)}$, giving $\displaystyle \hat{\theta}=\frac{n}{\sum_{i=1}^{n}\ln Y_i-n\ln Y_{(1)}}$.

The second-order derivative is negative, $\displaystyle \frac{\partial^2}{\partial\theta^2}\,l_Y=-\frac{n}{\theta^2}<0$, therefore this is a maximum.
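As a final sanity check of both closed-form estimators together (again an illustrative sketch, with true values $\displaystyle \theta=3$, $\displaystyle a=2$ picked arbitrarily): on a large simulated sample, $\displaystyle \hat{a}$ and $\displaystyle \hat{\theta}$ should land close to the true parameters.

```python
import numpy as np

# Simulate a large Pareto(theta, a) sample via the inverse CDF,
# Y = a * U**(-1/theta), then apply the closed-form MLEs derived above.
rng = np.random.default_rng(42)
theta_true, a_true, n = 3.0, 2.0, 200_000
y = a_true * rng.uniform(size=n) ** (-1.0 / theta_true)

a_hat = y.min()                                        # MLE of a: minimum order statistic
theta_hat = n / (np.log(y).sum() - n * np.log(a_hat))  # MLE of theta with a-hat plugged in

assert 2.0 <= a_hat < 2.01            # a_hat sits just above the true a = 2
assert abs(theta_hat - 3.0) < 0.05    # theta_hat close to the true theta = 3
```

Note that $\displaystyle \hat{a}$ always overshoots $\displaystyle a$ slightly (it is the sample minimum, which can never be below $\displaystyle a$), but the bias vanishes as $\displaystyle n$ grows.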