1. ## statistics help needed

Let $\displaystyle X_1, X_2,\ldots,X_n$ be a random sample from an exponential distribution with parameter $\displaystyle q$, that is, $\displaystyle f(x;q)=(1/q)e^{-x/q}$ for $\displaystyle x>0$ and $\displaystyle q>0$.

Find the Uniformly Minimum Variance Unbiased Estimator for $\displaystyle P[X<c]$ where $\displaystyle c$ is a known positive constant.

2. The sufficient statistic for $\displaystyle q$ is the sum, $\displaystyle S_n=\sum_{i=1}^nX_i$.
Next we need to find a function of this sum that is unbiased for $\displaystyle P(X<c)=1-e^{-c/q}$.

Since each $\displaystyle X_i\sim\Gamma(1,q)$, we have $\displaystyle S_n\sim\Gamma(n,q)$.
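As a quick numerical sanity check of that last claim (a sketch, not part of the original post; the values of $\displaystyle n$ and $\displaystyle q$ are illustrative), the sum of $\displaystyle n$ independent Exp($\displaystyle q$) draws should have the Gamma($\displaystyle n,q$) mean $\displaystyle nq$ and variance $\displaystyle nq^2$:

```python
import random

random.seed(0)

n, q = 5, 2.0        # illustrative sample size and exponential scale
trials = 200_000

# S_n = X_1 + ... + X_n, each X_i ~ Exponential with mean q
# (random.expovariate takes the rate 1/q, not the scale q)
sums = [sum(random.expovariate(1.0 / q) for _ in range(n)) for _ in range(trials)]

mean = sum(sums) / trials
var = sum((s - mean) ** 2 for s in sums) / trials

print(mean, n * q)       # empirical mean vs. Gamma(n, q) mean nq
print(var, n * q * q)    # empirical variance vs. Gamma(n, q) variance nq^2
```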

3. Would you help a little more?
How does knowing that $\displaystyle S_n\sim\Gamma(n,q)$ help me figure out the Uniformly Minimum Variance Unbiased Estimator?

4. You will need to apply the Rao-Blackwell Theorem directly.

Compute $\displaystyle E(I(X_1<c)|S_n=s)$

5. I am sorry to ask you this, but what is $\displaystyle I(X_1<c)$? Is $\displaystyle I$ a function?

6. Originally Posted by Kat-M
I am sorry to ask you this, but what is $\displaystyle I(X_1<c)$? Is $\displaystyle I$ a function?
$\displaystyle I$ stands for the indicator function of the event:
it is 1 when the event happens and 0 when it doesn't, so
$\displaystyle EI(X\in A)=P(X\in A)$.
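That identity is easy to see numerically; here is a minimal sketch (the values of $\displaystyle q$ and $\displaystyle c$ are illustrative, not from the thread) comparing the average of indicator values against the exact probability $\displaystyle P(X<c)=1-e^{-c/q}$:

```python
import math
import random

random.seed(1)

q, c = 2.0, 1.0                 # illustrative scale and cutoff
trials = 100_000

# I(X < c) is 1 when X < c and 0 otherwise; averaging it estimates E[I(X < c)].
indicator_mean = sum(1 if random.expovariate(1.0 / q) < c else 0
                     for _ in range(trials)) / trials

exact = 1 - math.exp(-c / q)    # P(X < c) for the exponential with scale q
print(indicator_mean, exact)    # the two agree up to Monte Carlo error
```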

7. I thought about this for a long time, but I am still not sure how to compute $\displaystyle E(I(X_1<c)|S_n=s)$.
$\displaystyle X_1$ is part of $\displaystyle S_n$, and I don't know how that affects the computation of $\displaystyle \int I(x_1<c)f(x_1|s)\,dx_1$.
I got $\displaystyle f(x_1|s)=\Gamma(n)/s^{n-1}$, so when I integrate it from 0 to c, I get $\displaystyle c\,\Gamma(n)/s^{n-1}$. Did I do it right?

8. Back to $\displaystyle E(I(X_1<c)|S_n=s)$, where $\displaystyle s>c$ (when $\displaystyle s\le c$ we have $\displaystyle X_1<s\le c$ automatically, so the conditional probability is 1).

We need the density $\displaystyle f(x_1|s_n) ={f(x_1) f(s_{n-1}) \over f(s_n) }$, where $\displaystyle s_{n-1}=x_2+\cdots +x_n$.

The density of $\displaystyle X_1$ is $\displaystyle f(x_1)=e^{-x_1/q}/q$.

While $\displaystyle f(s_{n-1})= {s_{n-1}^{n-2} e^{-s_{n-1}/q}\over \Gamma(n-1)q^{n-1}}$ and $\displaystyle f(s_n) ={s_n^{n-1} e^{-s_n/q}\over \Gamma(n)q^n}$

Noting that $\displaystyle x_1+s_{n-1}=s_n$ we have

$\displaystyle f(x_1|s_n) ={(n-1) (x_2+\cdots +x_n)^{n-2} \over (x_1+\cdots +x_n)^{n-1}}$

Thus $\displaystyle P(X_1<c|S_n=s)=\int_0^cf(x_1|S_n=s)\,dx_1=(n-1)\int_0^c {(s-x_1)^{n-2}\,dx_1\over s^{n-1}}={s^{n-1}-(s-c)^{n-1}\over s^{n-1}}$

which gives me $\displaystyle 1-\biggl(1-{c\over s}\biggr)^{n-1}$. Note the derivation requires $\displaystyle n\ge 2$ (it uses $\displaystyle \Gamma(n-1)$), which is why the formula doesn't make sense when n=1; for $\displaystyle n=1$ the conditional expectation is just $\displaystyle I(X_1<c)$ itself.
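One can check unbiasedness of this estimator by simulation: averaging $\displaystyle 1-(1-c/s)^{n-1}$ over many samples should recover $\displaystyle P(X<c)=1-e^{-c/q}$. A minimal sketch, assuming illustrative values of $\displaystyle n\ge 2$, $\displaystyle q$, and $\displaystyle c$, and clipping the estimator to 1 when $\displaystyle s\le c$ (the case noted above where the conditional probability is 1):

```python
import math
import random

random.seed(2)

n, q, c = 5, 2.0, 1.0               # illustrative values, n >= 2
trials = 200_000

def umvue(s):
    # 1 - (1 - c/s)^(n-1), with the value 1 when s <= c (X_1 < c is then certain)
    return 1.0 if s <= c else 1.0 - (1.0 - c / s) ** (n - 1)

# Average the estimator over many samples of S_n = X_1 + ... + X_n
est = sum(umvue(sum(random.expovariate(1.0 / q) for _ in range(n)))
          for _ in range(trials)) / trials

target = 1 - math.exp(-c / q)       # P(X < c), the quantity being estimated
print(est, target)                  # unbiasedness: the average matches the target
```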

9. Originally Posted by matheagle
Back to $\displaystyle E(I(X_1<c)|S_n=s)$ where s>c. [...] which gives me $\displaystyle 1-\biggl(1-{c\over s}\biggr)^{n-1}$, which doesn't make sense when n=1.
I don't think this is right.
The ideas are good, but the answer seems lame.
