Calculating bias with Monte Carlo methods

I currently have a textbook problem that I can't seem to work out:

Suppose X_1, ..., X_n ~ Exp(c) and consider the variance estimator sigmahat^2(X_1, ..., X_n) = (1/n) sum_{i=1}^{n} (X_i - Xbar)^2. Assume c is known.

For c=2, n=10 determine a Monte Carlo (MC) estimate of the bias of sigmahat^{2}. Carefully choose the MC sample size N such that the error of the estimate is small compared to the result.

Also, write an algorithm for given c and n that first determines an appropriate N for the estimate of the bias of sigmahat^{2} and then uses this to compute a good estimate of the bias.

Any help would be greatly appreciated!

Re: Calculating bias with Monte Carlo methods

Hey Mullineux.

You should really show us what you have tried, but to give you a hint: generate many samples (each of a fixed sample size), compute your parameter estimate from each one, and collect those estimates. The collection of estimates gives you a distribution for your estimator.

This distribution is essentially the sampling distribution of the estimator, and you can calculate the bias by finding the expectation of this distribution and comparing it against the actual population value.

It will just be a simple for loop: repeat N times, each time generating a sample of n observations and computing the estimator from it. The N resulting parameter estimates form the distribution, and their average minus the true value is your Monte Carlo estimate of the bias.
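The loop just described can be sketched in Python using only the standard library. One assumption here: Exp(c) is read as the exponential distribution with rate c, so the true variance is 1/c^2; the function names are my own.

```python
import random
import statistics

def sigmahat2(xs):
    """The biased variance estimator: (1/n) * sum((x_i - xbar)^2)."""
    xbar = sum(xs) / len(xs)
    return sum((x - xbar) ** 2 for x in xs) / len(xs)

def mc_bias(c, n, N, seed=0):
    """Monte Carlo estimate of the bias of sigmahat2 for n i.i.d. Exp(rate=c) draws."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(N):                                   # N simulation runs
        sample = [rng.expovariate(c) for _ in range(n)]  # n observations each
        estimates.append(sigmahat2(sample))
    true_var = 1.0 / c ** 2                              # variance of Exp(rate=c); c is known
    return statistics.fmean(estimates) - true_var

# For c=2, n=10 this should land close to the exact bias
# -sigma^2/n = -1/(n*c^2) = -0.025.
print(mc_bias(c=2.0, n=10, N=100_000))
```

With N = 100,000 the Monte Carlo standard error is well below 0.001, so the estimate is small compared to the bias itself, as the question asks.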

I don't know what platform you are using (R, C++, etc.), but the implementation in, for example, R is really simple.
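For the second part of the question (choosing N), one common approach is a pilot run: estimate the estimator's spread and a rough bias from a small number of simulations, then pick N so that the Monte Carlo standard error sd/sqrt(N) is a small fraction of |bias|. This is a sketch of one reasonable scheme, not necessarily the textbook's intended algorithm; the defaults and the safety cap are my own choices.

```python
import math
import random
import statistics

def estimate_bias(c, n, rel_err=0.05, pilot=2_000, max_N=500_000, seed=0):
    """Pilot run to choose N, then a full run to estimate the bias of sigmahat2.

    rel_err: target ratio of the MC standard error to the (rough) |bias|.
    max_N:   safety cap on the simulation budget.
    """
    rng = random.Random(seed)

    def one_estimate():
        xs = [rng.expovariate(c) for _ in range(n)]
        xbar = sum(xs) / n
        return sum((x - xbar) ** 2 for x in xs) / n

    true_var = 1.0 / c ** 2                 # variance of Exp(rate=c)

    # Pilot: rough bias and spread of the estimator.
    pilot_vals = [one_estimate() for _ in range(pilot)]
    sd = statistics.stdev(pilot_vals)
    rough_bias = statistics.fmean(pilot_vals) - true_var

    # Want sd/sqrt(N) <= rel_err * |bias|, i.e. N >= (sd / (rel_err * |bias|))^2.
    N = math.ceil((sd / (rel_err * abs(rough_bias))) ** 2)
    N = min(max(N, pilot), max_N)

    # Full run with the chosen N.
    vals = [one_estimate() for _ in range(N)]
    return N, statistics.fmean(vals) - true_var

N, bias = estimate_bias(c=2.0, n=10)
print(N, bias)   # bias should land near the exact value -1/(n*c^2) = -0.025
```

Note that the pilot's rough bias can be noisy, which is why the cap max_N is there: if the pilot happens to put the bias near zero, the formula would otherwise demand an enormous N.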
