1. ## Finding expected value

Let X_1, ..., X_n be i.i.d. random variables with density f and cumulative distribution function F. Let:

$\displaystyle I_{X_1}(a)=\begin{cases}1 & \text{if } X_1\leq a\\ 0 & \text{otherwise}\end{cases}$

I want to find the expected value of $\displaystyle I_{X_1}(2)$.

I can't seem to find the answer, since the distribution is unknown to me.

2. Originally Posted by noob mathematician
Let X_1, ..., X_n be i.i.d. random variables with density f and cumulative distribution function F. Let:

$\displaystyle I_{X_1}(a)=\begin{cases}1 & \text{if } X_1\leq a\\ 0 & \text{otherwise}\end{cases}$

I want to find the expected value of $\displaystyle I_{X_1}(2)$.

I can't seem to find the answer, since the distribution is unknown to me.
The expected value of an indicator function is just the probability of that event.

Hence $\displaystyle E(I_{X_1}(2))=P(X_1\le 2)=F_{X_1}(2)=F(2)$, since the $X_i$ share the common CDF $F$.
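As a quick numerical sanity check, here is a small Monte Carlo sketch. The Exponential(1) distribution is a purely illustrative assumption (it is not specified in the thread), chosen because its CDF $F(2)=1-e^{-2}$ is easy to compute in closed form:

```python
import random
import math

# Monte Carlo check that E[I_{X_1}(2)] = P(X_1 <= 2) = F(2).
# Hypothetical choice: X_i ~ Exponential(1), so F(2) = 1 - exp(-2).
random.seed(0)
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]

# Sample mean of the indicator I(X <= 2)
indicator_mean = sum(1 for x in samples if x <= 2) / n
true_F2 = 1 - math.exp(-2)

print(indicator_mean, true_F2)  # both near 0.8647
```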

3. Suppose now $\displaystyle p=P(X_1\leq 2)$. How do I find the MLE (maximum likelihood estimate) of p, based on the whole sample?

4. I would think of these as i.i.d. Bernoulli random variables.

Let $\displaystyle Y_i=I(X_i\le 2)$ for i=1,2,...,n.

Here $\displaystyle p=P(X_1\le 2)$ isn't really the point.

The MLE of p for i.i.d. Bernoullis is $\displaystyle \hat p$, the sample mean of the Y's:

$\displaystyle \hat p={\sum_{i=1}^nI(X_i\le 2)\over n}$

Since the Likelihood function is

$\displaystyle L= p^{Y_1}(1-p)^{1-Y_1}p^{Y_2}(1-p)^{1-Y_2}\cdots p^{Y_n}(1-p)^{1-Y_n}$

$\displaystyle = p^{\sum_{i=1}^n Y_i}(1-p)^{n-\sum_{i=1}^n Y_i}$

Now take the log and differentiate: setting $\displaystyle \frac{d}{dp}\log L=\frac{\sum_{i=1}^n Y_i}{p}-\frac{n-\sum_{i=1}^n Y_i}{1-p}=0$ gives $\displaystyle \hat p=\overline{Y}$, and the second derivative is negative there, confirming a maximum. So the sample mean is the MLE.
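The claim that the sample mean maximizes the likelihood can be checked numerically. The 0/1 data below are hypothetical, and the grid search is just a brute-force stand-in for the calculus argument:

```python
import math

# Hypothetical observed indicators Y_i = I(X_i <= 2)
Y = [1, 0, 1, 1, 0, 1, 0, 1, 1, 1]
n = len(Y)
s = sum(Y)

def log_lik(p):
    # log of L = p^(sum Y) * (1-p)^(n - sum Y)
    return s * math.log(p) + (n - s) * math.log(1 - p)

p_hat = s / n  # candidate MLE: the sample mean of the Y's

# Brute-force maximization over a fine grid of p values in (0, 1)
grid = [k / 1000 for k in range(1, 1000)]
best = max(grid, key=log_lik)

print(p_hat, best)  # grid maximizer agrees with the sample mean: 0.7
```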

5. Given the Y_i you defined:

Let's say now I want to know the MLE of p(1-p). Is it right for me to conclude that it is $\displaystyle \overline{Y}(1-\overline{Y})$?

6. I believe that is correct, by the invariance property of MLEs: if $\displaystyle \hat p$ is the MLE of p, then $\displaystyle g(\hat p)$ is the MLE of $\displaystyle g(p)$.
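A quick simulation illustrates the invariance idea. As before, the Exponential(1) distribution for the X_i is a hypothetical choice (not given in the thread), used only so the true value of p(1-p) is known for comparison:

```python
import random
import math

# With X_i ~ Exponential(1) (hypothetical), p = P(X <= 2) = 1 - exp(-2).
# By invariance, the MLE of g(p) = p(1-p) is g(p_hat) = Ybar * (1 - Ybar).
random.seed(1)
n = 200_000
Y = [1 if random.expovariate(1.0) <= 2 else 0 for _ in range(n)]

y_bar = sum(Y) / n              # MLE of p
mle_g = y_bar * (1 - y_bar)     # MLE of p(1-p) via invariance

p = 1 - math.exp(-2)
print(mle_g, p * (1 - p))  # estimate is close to the true p(1-p)
```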