Thread: Distribution until the First Increase?

1. Distribution until the First Increase?

Hey,

On a homework assignment I've been given this problem. I don't even know where to start. Is there a name for this distribution, or a way I can look up more information about it? Thanks!

----

Let $\displaystyle X_1, X_2,...$ be independent identically distributed continuous random variables. We call $\displaystyle n$ the first time of increase if

$\displaystyle X_1>X_2>X_3>...>X_{n-1}<X_n$

Let $\displaystyle N$ be the time until the first increase. Show that $\displaystyle E[N]=e$.

----

The hint we were given was to use the tail sum formula for a random variable taking values in the positive integers. But how do you go from a discrete formula to a continuous one? I am completely lost. What is this distribution called?

Thanks guys!

2. Originally Posted by mbaboy
Hey,

On a homework assignment I've been given this problem. I don't even know where to start. Is there a name for this distribution, or a way I can look up more information about it? Thanks!

----

Let $\displaystyle X_1, X_2,...$ be independent identically distributed continuous random variables. We call $\displaystyle n$ the first time of increase if

$\displaystyle X_1>X_2>X_3>...>X_{n-1}<X_n$

Let $\displaystyle N$ be the time until the first increase. Show that $\displaystyle E[N]=e$.

----

The hint we were given was to use the tail sum formula for a random variable taking values in the positive integers. But how do you go from a discrete formula to a continuous one? I am completely lost. What is this distribution called?

Thanks guys!
The nice fact is that the distribution of $\displaystyle N$ does not depend on the distribution of the $\displaystyle X_i$'s (provided it is diffuse: $\displaystyle P(X_i=x)=0$ for any $\displaystyle x$).

You should notice that all the orderings of $\displaystyle X_1,\ldots,X_n$ are equally likely. This is just because the r.v. $\displaystyle X_1,\ldots,X_n$ play symmetric roles (you should try to write a full proof). Therefore, there is only 1 chance out of $\displaystyle n!$ to have $\displaystyle X_1>\cdots>X_{n-1}>X_n$, for instance.

Use this to compute $\displaystyle P(N=n)=P(X_1>\cdots>X_{n-1}\mbox{ and }X_{n-1}<X_n)$...

After this (short) computation, the expected value will be straightforward.
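If you want to convince yourself numerically before doing the computation, here is a quick Monte Carlo sketch. The uniform distribution is an arbitrary choice here, since (as noted above) any diffuse distribution gives the same law for $\displaystyle N$:

```python
# Monte Carlo sanity check (not a proof): estimate E[N] by simulating
# i.i.d. continuous draws and recording the first time of increase.
# Uniform(0, 1) is an arbitrary choice; any diffuse distribution
# should give the same distribution for N.
import random

def first_increase() -> int:
    """Index n at which X_1 > X_2 > ... > X_{n-1} < X_n first holds."""
    prev = random.random()
    n = 2
    while True:
        cur = random.random()
        if cur > prev:
            return n
        prev, n = cur, n + 1

trials = 200_000
estimate = sum(first_increase() for _ in range(trials)) / trials
print(estimate)  # typically within a few hundredths of e = 2.718...
```

The running estimate should settle near $\displaystyle e\approx 2.718$.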

3. Ok tell me if I am on the right track. I'm not exactly sure how to compute this.

$\displaystyle P(N=n)=P(X_1>\cdots>X_{n-1}\mbox{ and }X_{n-1}<X_n)$

$\displaystyle =P(X_1>X_2)P(X_2>X_3)P(X_3>X_4)...P(X_{n-2}>X_{n-1})P(X_{n-1}<X_n)$

$\displaystyle =\frac{1}{2} \times \frac{1}{2^2} \times \frac{1}{2^3}...\frac{1}{2^{n-1}}$

OK, clearly I'm making no sense. Maybe if someone could tell me what this evaluates to, I could find my way to it? I just don't see how an $\displaystyle e$ can pop out, since the only two ways to calculate $\displaystyle e$ I know are

$\displaystyle \lim_{k\to\infty}\left(1+\frac{1}{k}\right)^k$ and $\displaystyle e^z=\sum_{i=0}^\infty \frac{z^i}{i!}$

I'm guessing the first comes up when evaluating an integral to get the expected value. Any help? I have taken only a little probability.

4. Originally Posted by mbaboy
Ok tell me if I am on the right track. I'm not exactly sure how to compute this.

$\displaystyle P(N=n)=P(X_1>\cdots>X_{n-1}\mbox{ and }X_{n-1}<X_n)$

$\displaystyle =P(X_1>X_2)P(X_2>X_3)P(X_3>X_4)...P(X_{n-2}>X_{n-1})P(X_{n-1}<X_n)$

$\displaystyle =\frac{1}{2} \times \frac{1}{2^2} \times \frac{1}{2^3}...\frac{1}{2^{n-1}}$

OK, clearly I'm making no sense.
Indeed, the events are not independent, so the probability isn't just the product of the probabilities.

Read my previous post again. It says that the only property you need is that, for any permutation $\displaystyle \sigma$ of $\displaystyle \{1,\ldots,n\}$, $\displaystyle P(X_{\sigma(1)}>\cdots >X_{\sigma(n)})=\frac{1}{n!}$. This is just because these events have the same probability and they partition the probability space (one and only one of these events happens).

Now, you can write the event $\displaystyle \{N=n\}$ using these events: to which orders does it correspond? Each order has probability $\displaystyle \frac{1}{n!}$, so... It is a one-liner but I would like you to find it.

5. Ok so tell me if this makes sense.

$\displaystyle P(N=n)=P(X_1>\cdots>X_{n-1}\mbox{ and }X_{n-1}<X_n)$

so $\displaystyle E(N)=\sum_{n=1}^\infty \frac{n}{n!}=e$

Is that right? Can I do that summation, since $\displaystyle N$ only takes values in the positive integers? I'm just confused because each $\displaystyle X_i$ can be defined continuously on all real numbers, yet somehow I am supposed to end up with a summation that evaluates to $\displaystyle e$.

Edit: OK, I see what you are saying! What I have above is wrong. The number of orderings is $\displaystyle (n-1)!$. Give me a few minutes with this; I think I've got it! Many thanks. I guess what I couldn't fathom is that although the $\displaystyle X_i$ are continuous, the distribution of $\displaystyle N$ obviously is not.
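Edit 2: For completeness, here is how the tail-sum hint plays out. Since $\displaystyle N\ge n$ exactly when $\displaystyle X_1>X_2>\cdots>X_{n-1}$, the symmetry argument from post #2 gives

$\displaystyle P(N\ge n)=P(X_1>X_2>\cdots>X_{n-1})=\frac{1}{(n-1)!}$

and therefore, by the tail-sum formula,

$\displaystyle E[N]=\sum_{n=1}^\infty P(N\ge n)=\sum_{n=1}^\infty \frac{1}{(n-1)!}=\sum_{k=0}^\infty \frac{1}{k!}=e$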