Maximum likelihood function

I'm doing it for a binomial random variable X with one observation X=x. All I want to know is whether I should write my estimator in terms of X or x.

I believe it to be X. If that is so, should my working be in terms of x and then, at the final stage, change x to X?

Thanks

Re: Maximum likelihood function

Quote:

Originally Posted by **Duke**

I'm doing it for a binomial random variable X with one observation X=x. All I want to know is whether I should write my estimator in terms of X or x.

I believe it to be X. If that is so, should my working be in terms of x and then, at the final stage, change x to X?

Thanks

Hi Duke! :)

An estimator is a function of the data that estimates a parameter of your model, which in this case is a binomial distribution with parameters n and p.

Typically you would use a sample to do the estimation.

What you would have is:

$X$ is the random variable,

$x$ is a specific (possible) observation,

$\bar x$ is the sample mean,

$\hat p = \dfrac{\bar x}{n}$ is the estimator for the parameter $p$ of the binomial distribution. (With a single observation, $\bar x = x$, so $\hat p = \frac{x}{n}$.)
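For completeness, this estimator is exactly what maximum likelihood gives you: with a single observation $X = x$ from a $\text{Bin}(n,p)$ distribution, maximizing the log-likelihood in $p$ yields $\hat p = x/n$. A sketch of the standard derivation:

```latex
L(p) = \binom{n}{x} p^{x} (1-p)^{n-x}
\quad\Longrightarrow\quad
\ell(p) = \log L(p) = \log\binom{n}{x} + x \log p + (n - x)\log(1 - p)

\ell'(p) = \frac{x}{p} - \frac{n - x}{1 - p} = 0
\quad\Longrightarrow\quad
\hat p = \frac{x}{n}
```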

Re: Maximum likelihood function

Ok, but I mean: in general, would I want the estimator in terms of the random variable or the specific observations? For example, you say (correctly) that $\hat p = \frac{\bar x}{n}$. But if I wanted $E(\hat p)$, I would do $E(\hat p) = E\!\left(\frac{\bar X}{n}\right)$. So should it really be $\hat p = \frac{\bar X}{n}$?

Re: Maximum likelihood function

The notation $\bar X$ represents the mean of $n$ random variables, but I do not think that is what you intend, since you have only one observation.

Typically $x$ would represent the sample, and you would write $\hat p = \frac{x}{n}$.

Your expectation would be written as $E(\hat p) = E\!\left(\frac{X}{n}\right)$, which is an expectation based on the random variable $X$, of which samples are taken to estimate $p$.

Btw, this expectation is equal to the parameter $p$: $E\!\left(\frac{X}{n}\right) = \frac{E(X)}{n} = \frac{np}{n} = p$.
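If it helps, that unbiasedness claim is easy to check numerically. Here is a quick Python sketch; the particular values of $n$, $p$, the seed, and the number of simulated samples are arbitrary choices for illustration, not anything from the thread:

```python
import random

# Parameters of the binomial model (arbitrary illustrative values).
n, p = 20, 0.3
m = 100_000  # number of simulated observations of X

random.seed(42)

# Each trial: draw X ~ Binomial(n, p) as a sum of n Bernoulli(p) trials,
# then form the estimator p_hat = X / n.
estimates = []
for _ in range(m):
    x = sum(1 for _ in range(n) if random.random() < p)
    estimates.append(x / n)

# Averaging p_hat over many independent samples approximates E(p_hat),
# which should be close to the true parameter p = 0.3.
mean_estimate = sum(estimates) / m
print(mean_estimate)  # close to 0.3
```

Note that inside the loop $x$ is a realized value, while the quantity being approximated, $E(\hat p)$, is a statement about the random variable $X$, which mirrors the notational point above.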

In other words, the choice of symbol depends on what you're talking about: the capital $X$ for the random variable, the lowercase $x$ for an observed value.