Maximum likelihood function

I'm doing it for a binomial random variable X with one observation X=x. All I want to know is whether I should write my estimator in terms of X or x.

I believe it should be X. If so, should my working be in terms of x and then, at the final stage, change x to X?

Thanks

Re: Maximum likelihood function

Quote: Originally Posted by **Duke**

I'm doing it for a binomial random variable X with one observation X=x. All I want to know is whether I should write my estimator in terms of X or x.

I believe it should be X. If so, should my working be in terms of x and then, at the final stage, change x to X?

Thanks

Hi Duke! :)

An estimator is a function that estimates a parameter of your model, which in this case is a binomial distribution that has parameters n and p.

Typically you would use a sample to do the estimation.

What you would have is:

X the random variable

x a specific (possible) observation

$\displaystyle \bar x$ the sample mean

$\displaystyle \hat p = {\bar x \over n}$ the estimator for the parameter p of the binomial distribution.
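For reference, that estimator is what maximizing the binomial likelihood produces; here is a quick sketch for a single observation $X = x$:

$\displaystyle L(p) = \binom{n}{x} p^x (1-p)^{n-x}$

$\displaystyle \log L(p) = \log \binom{n}{x} + x \log p + (n - x) \log (1 - p)$

$\displaystyle \frac{d}{dp} \log L(p) = \frac{x}{p} - \frac{n - x}{1 - p} = 0 \quad\Rightarrow\quad \hat p = \frac{x}{n}$

With one observation $\bar x = x$, so this agrees with $\displaystyle \hat p = {\bar x \over n}$ above.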

Re: Maximum likelihood function

Ok, but I mean in general: would I want the estimator in terms of the random variable or the specific observations? For example, you say (correctly) that $\displaystyle \hat p = {\bar x \over n}$. But if I wanted $\displaystyle E(\hat p)$, I would compute $\displaystyle E({\bar X \over n})$. So should it really be $\displaystyle \hat p = {\bar X \over n}$?

Re: Maximum likelihood function

The notation $\displaystyle \bar X$ represents the mean of a sample of random variables (note that the sample size need not equal the binomial parameter n), but I do not think that is what you intend.

Typically x would represent the sample, and you would write $\displaystyle \hat p(x) = {\bar x \over n}$.

Your expectation would be written as $\displaystyle E(\hat p) = E[\hat p(X)] = E[X / n]$, which is an expectation with respect to the random variable X, from which samples are taken to compute $\displaystyle \hat p$.

Btw, this expectation is equal to the parameter p, since $\displaystyle E[X/n] = np/n = p$, so the estimator is unbiased.
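You can see this unbiasedness numerically. Here is a small sketch (the parameter values n = 20, p = 0.3 and the trial count are just assumptions for illustration) that simulates many Binomial(n, p) observations, computes $\displaystyle \hat p = x/n$ for each, and checks that the average of the estimates comes out close to the true p:

```python
import random

random.seed(42)

n, p = 20, 0.3          # assumed parameter values for this sketch
num_trials = 100_000    # number of simulated observations


def binomial_draw(n, p):
    """One Binomial(n, p) observation: the number of successes in n Bernoulli(p) trials."""
    return sum(1 for _ in range(n) if random.random() < p)


# Estimate p from each simulated observation, then average the estimates.
estimates = [binomial_draw(n, p) / n for _ in range(num_trials)]
mean_estimate = sum(estimates) / num_trials

print(mean_estimate)  # should be close to the true p = 0.3
```

The average of $\displaystyle \hat p$ over many samples hovers around the true p, which is exactly the statement $\displaystyle E(\hat p) = p$.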

In other words, the choice for the symbol depends on what you're talking about.