Does anyone know a website where I can see the derivations of the MMEs and MLEs of various discrete and continuous distributions? I am unable to follow the derivations in my book.
Thank you
If you have a sample of size $\displaystyle n$, $\displaystyle (k_1,k_2,\dots,k_n)$, from a discrete distribution with pmf $\displaystyle p_X$ (or a continuous one with pdf $\displaystyle f_Y$) and parameter $\displaystyle \theta$, you form
$\displaystyle L(\theta)=\prod_{i=1}^np_X(k_i;\theta)$ or a similar form for the continuous case.
The MLE for $\displaystyle \theta$, written $\displaystyle \theta_e$, is any value of $\displaystyle \theta$ that maximizes $\displaystyle L(\theta)$.
To maximize, take the derivative with respect to $\displaystyle \theta$ and set it equal to zero. With MLEs it is often easier to differentiate $\displaystyle \ln L(\theta)$ instead, since the logarithm turns the product into a sum and has the same maximizer.
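As a quick worked example (a standard one, not from your book): suppose the sample comes from a Poisson distribution with parameter $\displaystyle \lambda$, so $\displaystyle p_X(k;\lambda)=\frac{\lambda^k e^{-\lambda}}{k!}$. Then

$\displaystyle \ln L(\lambda)=\left(\sum_{i=1}^n k_i\right)\ln\lambda-n\lambda-\sum_{i=1}^n\ln(k_i!)$

Setting $\displaystyle \frac{d}{d\lambda}\ln L(\lambda)=\frac{\sum_{i=1}^n k_i}{\lambda}-n=0$ gives $\displaystyle \lambda_e=\frac{1}{n}\sum_{i=1}^n k_i$, the sample mean.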
The method of moments estimators come from matching the sample moments $\displaystyle \left(\frac{1}{n}\sum_{i=1}^nk_i^r\right)$ with the theoretical moments $\displaystyle E(X^r)$, which are functions of the parameters; with one parameter you match the first moment, with two parameters the first two, and so on.
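For example (again a standard textbook case), take an exponential distribution with pdf $\displaystyle f_Y(y;\theta)=\theta e^{-\theta y}$ for $\displaystyle y>0$, whose first moment is $\displaystyle E(Y)=\frac{1}{\theta}$. Matching it with the first sample moment gives

$\displaystyle \frac{1}{\theta}=\frac{1}{n}\sum_{i=1}^n k_i$

so the method of moments estimator is $\displaystyle \theta_e=\frac{1}{\bar{k}}$, the reciprocal of the sample mean. (Here the MLE happens to agree, but in general the two methods can give different estimators.)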
If you have specific examples of ones you are having trouble with, you can post them and I can show you how to find these estimates. I think you are probably just as capable as anyone else of finding online proofs of these though.