# Thread: Can we use Basu's theorem?

1. ## Can we use Basu's theorem?

Suppose $X_1, X_2, X_3, \dots$ is a sequence of independent random variables, each uniformly distributed on (0, 1). Let $N = \min\{n > 0 \mid X_{n:n} - X_{1:n} > a\}$, where $0 < a < 1$ is a constant and
$X_{n:n} = \max_{1 \leq i \leq n} X_i$,
$X_{1:n} = \min_{1 \leq i \leq n} X_i$.
What is E[N]?

P.S. 1) Does stating the problem this way invite Basu's theorem (the theorem stating that a complete sufficient statistic is independent of any ancillary statistic), or not?

P.S. 2) The subscripts n:n and 1:n are order-statistic indices.
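Before attacking this analytically, $E[N]$ can be estimated by direct simulation of the stopping rule. A minimal sketch (the function names, trial count, and seed are my own choices, not part of the problem): draw uniforms one at a time, track the running min and max, and stop the first time the range exceeds $a$.

```python
import random

def sample_n(a, rng):
    """Draw U(0,1) variates until max - min first exceeds a; return that count N."""
    lo = hi = rng.random()
    n = 1
    while hi - lo <= a:
        x = rng.random()
        lo = min(lo, x)
        hi = max(hi, x)
        n += 1
    return n

def estimate_en(a, trials=100_000, seed=1):
    """Monte Carlo estimate of E[N] for a given threshold a."""
    rng = random.Random(seed)
    return sum(sample_n(a, rng) for _ in range(trials)) / trials

print(estimate_en(0.5))  # empirical mean; comes out close to 4 for a = 0.5
```

Running this for several values of $a$ gives a numerical target that any closed-form answer for $E[N]$ would have to match.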

2. Originally Posted by alv
Suppose $X_1, X_2, X_3, \dots$ is a sequence of independent random variables, each uniformly distributed on (0, 1). Let $N = \min\{n > 0 \mid X_{n:n} - X_{1:n} > a\}$, where $0 < a < 1$ is a constant and
$X_{n:n} = \max_{1 \leq i \leq n} X_i$,
$X_{1:n} = \min_{1 \leq i \leq n} X_i$.
What is E[N]?

P.S. 1) Does stating the problem this way invite Basu's theorem (the theorem stating that a complete sufficient statistic is independent of any ancillary statistic), or not?

P.S. 2) The subscripts n:n and 1:n are order-statistic indices.
$X_{n:n}$ and $X_{1:n}$ are random variables, so I'm not quite sure how to interpret the statement $X_{n:n} - X_{1:n} > a$. Do you mean that $\Pr(X_{n:n} - X_{1:n} > a)$ is greater than some value?

Be that as it may, the following might help. I'm changing the notation, writing Xn:n as $X_{(n)}$ and X1:n as $X_{(1)}$.

Let $X_1, X_2, \dots, X_n$ be a sequence of i.i.d. random variables with pdf $f(x)$, cdf $F(x)$ and order statistics $X_{(1)}, X_{(2)}, \dots, X_{(n)}$.

It can be shown that the pdf of the interval $W_{rs} = X_{(s)} - X_{(r)}$ (where $1 \leq r < s \leq n$) is given by

$f(w_{rs}) = \frac{n!}{(r-1)! \, (s-r-1)! \, (n-s)!} \int_{-\infty}^{+\infty} [F(x)]^{r-1} \, f(x) \, [F(x + w_{rs}) - F(x)]^{s-r-1} \, f(x + w_{rs}) \, [1 - F(x + w_{rs})]^{n-s} \, dx$ .... (1).

When $r = 1$ and $s = n$, the interval $W_{rs}$ becomes the range $W$ and equation (1) reduces to:

$f(w) = n (n-1) \, \int_{-\infty}^{+\infty} f(x) \, [ F(x + w) - F(x) ]^{n-2} \, f(x + w) \, dx$ .... (2).

When $X_i \sim U(0, 1)$, $f(x) = 1$ for $0 \leq x \leq 1$ and zero elsewhere, and equation (2) becomes:

$f(w) = n (n-1) \, \int_{0}^{1-w} (1) \, [ x + w - x ]^{n-2} \, (1) \, dx = n (n-1) \, \int_{0}^{1-w} w^{n-2} dx$

where the upper limit of integration arises because $f(x + w) = 0$ for $x > 1 - w$. Therefore the pdf of the range of $n$ numbers selected at random from the interval (0, 1) is:

$f(w) = n(n-1)w^{n-2} (1 - w), \quad 0 \leq w \leq 1$ .... (3).
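The reduction from (2) to (3) can be sanity-checked numerically. The sketch below (function name and step count are mine) evaluates the uniform-case integral in equation (2) with a midpoint Riemann sum and compares it with the closed form (3); since the integrand is constant in $x$, the two agree essentially exactly.

```python
def uniform_range_pdf_numeric(w, n, steps=100_000):
    """Midpoint Riemann sum of the integral in equation (2) for U(0,1):
    f = 1 on [0, 1], and F(x + w) - F(x) = w for 0 <= x <= 1 - w."""
    h = (1.0 - w) / steps
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) * h
        # integrand: f(x) [F(x+w) - F(x)]^{n-2} f(x+w) = 1 * w^{n-2} * 1
        total += ((x + w) - x) ** (n - 2)
    return n * (n - 1) * total * h

n, w = 5, 0.3
print(uniform_range_pdf_numeric(w, n))        # numeric value of equation (2)
print(n * (n - 1) * w ** (n - 2) * (1 - w))   # closed form, equation (3)
```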

Note, by the way, the interesting result that $E(W) = \frac{n-1}{n+1}$, so $E(W) \to 1$ in the limit $n \to \infty$.
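For what it's worth, that expectation is easy to confirm by simulation. A quick Monte Carlo sketch (helper name, trial count, and seed are my own choices) compares the empirical mean range of $n$ uniform draws against $\frac{n-1}{n+1}$:

```python
import random

def mean_range(n, trials=200_000, seed=2):
    """Monte Carlo estimate of E(W), the expected range of n U(0,1) draws."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        xs = [rng.random() for _ in range(n)]
        total += max(xs) - min(xs)
    return total / trials

for n in (2, 5, 20):
    print(n, round(mean_range(n), 4), (n - 1) / (n + 1))  # estimate vs (n-1)/(n+1)
```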