1. Finding the expected value

Let $X_1,\dots,X_n$ be i.i.d. random variables with density $f$ and cumulative distribution function $F$. Let:

$I_{X_1}(a)=1 \text{ if } X_1\leq a, \text{ and } 0 \text{ otherwise.}$

I want to find the expected value of $I_{X_1}(2)$.

I can't seem to find the answer, since the distribution is unknown to me.

2. Originally Posted by noob mathematician
Let $X_1,\dots,X_n$ be i.i.d. random variables with density $f$ and cumulative distribution function $F$. Let:

$I_{X_1}(a)=1 \text{ if } X_1\leq a, \text{ and } 0 \text{ otherwise.}$

I want to find the expected value of $I_{X_1}(2)$.

I can't seem to find the answer, since the distribution is unknown to me.
The expected value of an indicator function is just the probability of the event it indicates.

Hence $E(I_{X_1}(2))=P(X_1\le 2)=F_{X_1}(2)$
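Since the original poster's distribution is unknown, here is a quick Monte Carlo sanity check of the identity $E(I_{X_1}(2))=F(2)$ using a standard normal as a stand-in distribution (that choice is mine, not from the thread):

```python
# Monte Carlo check of E[I_X(2)] = F(2), using X ~ N(0, 1) as an
# illustrative choice (the thread's distribution is unspecified).
import random
import statistics
from math import erf, sqrt

def normal_cdf(x):
    # CDF of the standard normal, written via the error function
    return 0.5 * (1 + erf(x / sqrt(2)))

random.seed(0)
n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
indicators = [1 if x <= 2 else 0 for x in xs]

# The sample mean of the indicators estimates E[I_X(2)] = P(X <= 2).
empirical = statistics.mean(indicators)
print(empirical, normal_cdf(2))
```

With 100,000 draws the empirical mean of the indicators should land within a fraction of a percent of $\Phi(2)\approx 0.9772$.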

3. Suppose now $p=P(X_1\leq 2)$. How do I find the MLE (maximum likelihood estimate) of $p$, based on the whole sample?

4. I would think that these are i.i.d. Bernoullis.

Let $Y_i=I(X_i\le 2)$ for $i=1,2,\dots,n$.

Here $p=P(X_1\le 2)$ isn't really the point.

The MLE of $p$ for i.i.d. Bernoullis is $\hat p$, the sample mean of the $Y_i$'s:

$\hat p={\sum_{i=1}^nI(X_i\le 2)\over n}$

since the likelihood function is

$L= p^{Y_1}(1-p)^{1-Y_1}p^{Y_2}(1-p)^{1-Y_2}\cdots p^{Y_n}(1-p)^{1-Y_n}$

$= p^{\sum_{i=1}^n Y_i}(1-p)^{n-\sum_{i=1}^n Y_i}$

Now take the log, differentiate, and set the derivative to zero; solving gives the sample mean as the MLE (the second derivative confirms it is a maximum).
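The maximization step above can be checked numerically: the log-likelihood $\ell(p)=S\log p+(n-S)\log(1-p)$, with $S=\sum Y_i$, should peak at $p=S/n$. This sketch uses an exponential distribution for the $X_i$ purely as an example (any distribution works, since only the $Y_i$ matter):

```python
# Numerical check that the Bernoulli log-likelihood is maximized
# at the sample mean of the Y_i. The exponential choice for the
# X_i is illustrative only; the thread's F is unspecified.
import random
from math import log

random.seed(1)
n = 50
xs = [random.expovariate(1.0) for _ in range(n)]
ys = [1 if x <= 2 else 0 for x in xs]
S = sum(ys)

def log_lik(p):
    # l(p) = S*log(p) + (n - S)*log(1 - p)
    return S * log(p) + (n - S) * log(1 - p)

# crude grid search over p in (0, 1)
grid = [i / 1000 for i in range(1, 1000)]
p_best = max(grid, key=log_lik)
p_hat = S / n  # the closed-form MLE: the sample mean of the Y_i
print(p_hat, p_best)
```

The grid maximizer sits at (or within one grid step of) the sample mean, matching the calculus derivation.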

5. Given the $Y_i$ you defined:

Say now I want to find the MLE of $p(1-p)$; is it right for me to conclude that it is $\overline{Y}(1-\overline{Y})$?

6. I believe that is correct, by the invariance property of MLEs.