# upper bound on expected value

#### BigCrunsh

Hi

I found a proof, but I cannot follow it. The critical part is the following:

$$\displaystyle \mathbb{E}\left[\frac 1 n \sum_{i=1}^n x_i \left(1-\frac 1 n \sum_{i=1}^n w_i +\left(\frac 1 n \sum_{i=1}^n w_i\right)^2-\left(\frac 1 n \sum_{i=1}^n w_i\right)^3+...\right)\right]-X$$
where
$$\displaystyle \mathbb{E}\left[x_i\right]=X, \mathbb{E}\left[w_i\right]=0$$

I need an approximation / upper bound in terms of n. The authors assert that it is of order $$\displaystyle n^{-1}$$. Can anyone explain this?
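As far as I can tell, the bracketed series is just the geometric expansion of the reciprocal of the denominator's sample mean: writing $$\displaystyle \bar{w}=\frac 1 n \sum_{i=1}^n w_i$$ (my notation), for $$\displaystyle |\bar{w}|<1$$ we have
$$\displaystyle \frac{1}{1+\bar{w}}=1-\bar{w}+\bar{w}^2-\bar{w}^3+...$$
so the whole expression should equal $$\displaystyle \mathbb{E}\left[\frac{\frac 1 n \sum_{i=1}^n x_i}{1+\bar{w}}\right]-X$$, i.e. the bias of a ratio estimator whose denominator has mean 1.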

Thanks in advance!

#### SpringFan25

It would help to see more context: e.g. the preceding line, and what it is you are trying to find the expected value of.

#### BigCrunsh

What I'm trying to do is estimate the bias of a self-normalized importance sampler, that is,
$$\displaystyle \hat{\mu}=\frac{\sum_{i=1}^n w(z_i) h(z_i)}{\sum_{i=1}^n w(z_i) }$$

For the sake of simplicity, with $$\displaystyle x_i = w(z_i)h(z_i)$$ and $$\displaystyle y_i = w(z_i)$$, write it as
$$\displaystyle \hat{\mu}=\frac{\sum_{i=1}^n x_i}{\sum_{i=1}^n y_i}$$

The bias is given by
$$\displaystyle Bias=\mathbb{E}[\hat{\mu}]-X$$

A Taylor expansion of $$\displaystyle \frac x y$$ around the expected values $$\displaystyle X$$ and $$\displaystyle 1$$ (writing $$\displaystyle y_i = 1 + w_i$$, so that $$\displaystyle \mathbb{E}[w_i]=0$$) leads to the equation given above.
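My own attempt at the leading term, with $$\displaystyle \bar{x}, \bar{w}$$ the sample means and assuming i.i.d. pairs $$\displaystyle (x_i,w_i)$$ with enough finite moments (please correct me if this is wrong): keeping only the first two terms of the series,
$$\displaystyle \mathbb{E}\left[\bar{x}\left(1-\bar{w}\right)\right]-X=\underbrace{\mathbb{E}[\bar{x}]-X}_{=0}-\operatorname{Cov}(\bar{x},\bar{w})=-\frac 1 n \operatorname{Cov}(x_1,w_1),$$
since $$\displaystyle \operatorname{Cov}(\bar{x},\bar{w})=\frac{1}{n^2}\sum_{i,j}\operatorname{Cov}(x_i,w_j)$$ and only the $$\displaystyle n$$ diagonal terms are nonzero by independence. The quadratic term satisfies $$\displaystyle \mathbb{E}\left[\bar{x}\,\bar{w}^2\right]=\frac{X\operatorname{Var}(w_1)}{n}+O(n^{-2})$$, and the remaining terms of the series appear to be $$\displaystyle O(n^{-2})$$, which would give the $$\displaystyle C/n$$ rate.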

I'm interested in a bound depending on $$\displaystyle n$$. Some books assert that
$$\displaystyle |Bias|\leq \frac C n$$
for some constant $$\displaystyle C \geq 0$$.
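Here is a quick numerical sanity check of the $$\displaystyle C/n$$ claim, on a toy setup I made up (not from any of the books): target $$\displaystyle p = N(1,1)$$, proposal $$\displaystyle q = N(0,1)$$, $$\displaystyle h(z)=z$$, so the true mean is 1 and the weights are $$\displaystyle w(z)=e^{z-1/2}$$.

```python
import numpy as np

rng = np.random.default_rng(0)
MU_TRUE = 1.0  # E_p[h(Z)] for p = N(1,1), h(z) = z

def snis_bias(n, reps=20000):
    """Monte Carlo estimate of the bias of the self-normalized
    importance sampler mu_hat = sum(w_i h_i) / sum(w_i),
    averaged over `reps` independent replications."""
    z = rng.standard_normal((reps, n))  # draws from the proposal q = N(0,1)
    w = np.exp(z - 0.5)                 # importance weights p(z)/q(z)
    mu_hat = (w * z).sum(axis=1) / w.sum(axis=1)
    return mu_hat.mean() - MU_TRUE

bias_small = snis_bias(10)
bias_large = snis_bias(100)
print(bias_small, bias_large)  # the bias shrinks roughly like 1/n
```

If I computed correctly, the delta-method leading term here is $$\displaystyle -e/n$$, and in the simulation the estimated bias at $$\displaystyle n=100$$ comes out roughly a tenth of that at $$\displaystyle n=10$$, consistent with the $$\displaystyle C/n$$ bound.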

Can you give me a more detailed explanation?