1. Moment estimators?

Specify the moment estimators for $\mu$ and $\sigma^{2}$ for the normal distribution

I've been looking over my textbook to see how to do this, but all I've found is this:

An alternative form of estimation is accomplished through the method of moments. The method involves equating the population mean and variance to the corresponding sample mean $\bar{x}$ and sample variance $s^2$ and solving for the parameter, the result being the moment estimator.

There are no examples or anything, so I'm not sure of what to do. I've tried looking online but haven't had luck finding any examples that I understand.

2. For a $N(\mu, \sigma^2)$ with both parameters unknown and $n \geq 2$, you have:

$\theta = (\theta_1, \theta_2) = (\mu,\; \sigma)$

Now, the first moment is:

$\displaystyle m_1 = {m_1}(\theta_1, \theta_2) = \int_X x \; f(x ; \theta)dx = E_{\theta}[X_{1}] = \mu = \frac{1}{n} \sum_{i=1}^n X_i$

and the second moment is:

$\displaystyle m_2 = {m_2}(\theta_1, \theta_2) = \int_X x^2 \; f(x ; \theta)dx = E_{\theta}[{X_{1}}^2] = \mu^2+\sigma^2 = \frac{1}{n} \sum_{i=1}^n {X_i}^2$

So, you have:

$\displaystyle \mu = \frac{1}{n} \sum_{i=1}^n X_i$

$\displaystyle \mu^2+\sigma^2 = \frac{1}{n} \sum_{i=1}^n {X_i}^2$

Solve these two equations simultaneously for $\mu$ and $\sigma$ to obtain the estimators.
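A minimal numerical sketch of that procedure (hypothetical values, Python standard library only): draw a sample from a normal with known parameters, form the two sample moments, and solve the two equations for $\mu$ and $\sigma^2$.

```python
import random

# Hypothetical illustration: sample from N(mu=5, sigma=2), then solve
#   mu           = m1
#   mu^2 + sigma^2 = m2
# for the moment estimators.
random.seed(0)
mu_true, sigma_true = 5.0, 2.0
xs = [random.gauss(mu_true, sigma_true) for _ in range(100_000)]

n = len(xs)
m1 = sum(xs) / n                 # first sample moment
m2 = sum(x * x for x in xs) / n  # second sample moment

mu_hat = m1                      # from the first equation
sigma2_hat = m2 - m1 ** 2        # substitute mu_hat into the second

print(mu_hat, sigma2_hat)        # close to 5 and 4 for a large sample
```

The "simultaneous solving" here is just substitution: the first equation gives $\hat{\mu}$ directly, and plugging it into the second isolates $\hat{\sigma}^2$.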

3. Thanks for the response. After some reading, I get what you're doing. But to reemphasize:

To get moment 1 I do:

$\displaystyle \int x \left( \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{(x-\mu)^2}{2\sigma^2}} \right) dx$

And for moment 2 I do:

$\displaystyle \int x^{2} \left( \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{(x-\mu)^2}{2\sigma^2}} \right) dx$

If this is correct, my only remaining problems are how to go about integrating these and a strategy for simultaneously solving the resulting equations for $\mu$ and $\sigma$.

I can find the resulting expressions for the first and second moments all over the Internet, but I can't seem to find any instructions for actually computing the integrals that lead to them.

4. Yes.

The $n$th moment of a continuous distribution is given by

$\displaystyle \int x^n \; f(x) dx$

$E[X] = \mu$

$Var(X) = \sigma^2 = E[X^2]-(E[X])^2 \implies E[X^2]=\sigma^2+\mu^2$
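For the integrals themselves, one standard route is the substitution $z = (x - \mu)/\sigma$, so $x = \mu + \sigma z$ and $dx = \sigma\,dz$. The first moment becomes

$\displaystyle \int_{-\infty}^{\infty} x \, \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{(x-\mu)^2}{2\sigma^2}} \, dx = \int_{-\infty}^{\infty} (\mu + \sigma z) \, \frac{1}{\sqrt{2\pi}} e^{-z^2/2} \, dz = \mu \cdot 1 + \sigma \cdot 0 = \mu$

since the standard normal density integrates to $1$ and $z\,e^{-z^2/2}$ is an odd function. The same substitution on the second moment gives

$\displaystyle \int_{-\infty}^{\infty} (\mu + \sigma z)^2 \, \frac{1}{\sqrt{2\pi}} e^{-z^2/2} \, dz = \mu^2 + 2\mu\sigma \cdot 0 + \sigma^2 \cdot 1 = \mu^2 + \sigma^2$

using $\displaystyle \int_{-\infty}^{\infty} z^2 \, \frac{1}{\sqrt{2\pi}} e^{-z^2/2} \, dz = 1$, which follows from integration by parts.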

But to find the estimator here, you have:

$\displaystyle \mu = \dfrac{1}{n} \sum_{i=1}^n X_i = \dfrac{X_1 + X_2 + \cdots + X_n}{n} = \overline{X}$

so, $\hat{\mu} = \overline{X}$

and,

$\displaystyle \hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^n {X_i}^2 - \hat{\mu}^2 = \frac{1}{n} \sum_{i=1}^n {X_i}^2 - \overline{X}^2 = \frac{1}{n} \sum_{i=1}^n \left(X_i - \overline{X}\right)^2$
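As a quick sanity check (a hypothetical data set, Python standard library only), the two forms of the estimator for $\sigma^2$ agree on any sample, by the algebraic identity $\frac{1}{n}\sum X_i^2 - \overline{X}^2 = \frac{1}{n}\sum (X_i - \overline{X})^2$:

```python
import random

# Hypothetical data: any sample works, since the identity is algebraic.
random.seed(1)
xs = [random.gauss(0.0, 1.0) for _ in range(1000)]

n = len(xs)
xbar = sum(xs) / n
form1 = sum(x * x for x in xs) / n - xbar ** 2        # second moment minus mean squared
form2 = sum((x - xbar) ** 2 for x in xs) / n          # average squared deviation

print(abs(form1 - form2))  # zero up to floating-point rounding
```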