# Thread: Uniform distribution

1. **Uniform distribution**

---

I'm not quite sure what part (a) is asking.

For part (b), I'm thinking this has something to do with Chebyshev's inequality, but the examples of Chebyshev's inequality in my textbook and notes don't involve 'max {...}', so I'm pretty much stuck. Intuitively, I think the probability tends to 0 as $n$ tends to infinity, but I don't know how to prove it.

2. I think you'll find this thread interesting. It gives you a possible choice of what $\displaystyle k_n$ should be and there are hints for a proof; this should allow you to understand how $\displaystyle k_n$ must be chosen.

3. Originally Posted by Laurent
I think you'll find this thread interesting. It gives you a possible choice of what $\displaystyle k_n$ should be and there are hints for a proof; this should allow you to understand how $\displaystyle k_n$ must be chosen.
Here's what I've done, but I'm not sure how to continue.

$\displaystyle P(k_n(\theta-Y_n)\leq x)=P(Y_n\geq\theta-\frac{x}{k_n})=1-P(Y_n\leq\theta-\frac{x}{k_n})$

$\displaystyle =1-P(\max(X_1,X_2,\ldots,X_n)\leq\theta-\frac{x}{k_n})=1-[P(X_1\leq\theta-\frac{x}{k_n})]^n$

$\displaystyle =1-(1-\frac{x}{\theta k_n})^n$

How do I continue from here? What do they mean when they say $\displaystyle k_n$ is a sequence of constants?

Does it have anything to do with $\displaystyle e^{-x}$? If yes, then I guess I more or less get it.

4. Originally Posted by knighty
Here's what I've done, but I'm not sure how to continue.

$\displaystyle P(k_n(\theta-Y_n)\leq x)=P(Y_n\geq\theta-\frac{x}{k_n})=1-P(Y_n\leq\theta-\frac{x}{k_n})$

$\displaystyle =1-P(\max(X_1,X_2,\ldots,X_n)\leq\theta-\frac{x}{k_n})=1-[P(X_1\leq\theta-\frac{x}{k_n})]^n$

$\displaystyle =1-(1-\frac{x}{\theta k_n})^n$

How do I continue from here? What do they mean when they say $\displaystyle k_n$ is a sequence of constants?

Does it have anything to do with $\displaystyle e^{-x}$? If yes, then I guess I more or less get it.
What you did is fine. Now you are asked to choose $\displaystyle k_n$ to be any sequence such that $\displaystyle 1-(1-\frac{x}{\theta k_n})^n$ has a limit when $\displaystyle n\to\infty$. There are many possible choices.

The most obvious would be $\displaystyle k_n=n$ or $\displaystyle k_n=n/\theta$ (or $\displaystyle k_n=\alpha n$ with an arbitrary $\displaystyle \alpha>0$). You indeed get $\displaystyle 1-e^{-x/\theta}$ or $\displaystyle 1-e^{-x}$ (or $\displaystyle 1-e^{-x/(\alpha\theta)}$), which is the distribution function of an exponential distribution with some parameter.

What happens for other choices? Suppose for instance $\displaystyle k_n=n^2$ or any sequence with $\displaystyle \frac{k_n}{n}\to\infty$. Then you can see that the limit would be 0, which is not a distribution function.
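One way to make this precise (a sketch, assuming $x>0$ is fixed, $\displaystyle k_n\to\infty$, and using the expansion $\displaystyle \log(1-u)=-u+O(u^2)$):

$\displaystyle \left(1-\frac{x}{\theta k_n}\right)^n=\exp\left(n\log\left(1-\frac{x}{\theta k_n}\right)\right)=\exp\left(-\frac{nx}{\theta k_n}+O\left(\frac{n}{k_n^2}\right)\right),$

so everything is governed by $\displaystyle \frac{n}{k_n}$: if $\displaystyle \frac{k_n}{n}\to\infty$, the exponent tends to 0, the power tends to 1, and hence $\displaystyle 1-\left(1-\frac{x}{\theta k_n}\right)^n\to 0$ as claimed.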

If to the contrary $\displaystyle \frac{k_n}{n}\to 0$ (still with $\displaystyle k_n\to\infty$), for instance $\displaystyle k_n=\sqrt{n}$, then the limit is 1, which is the distribution function of the Dirac probability measure at 0, i.e. the distribution of a r.v. that is constant, equal to 0. This would be a correct answer, but it is called a "degenerate" limit, which is not very interesting. The best choice is when $\displaystyle k_n$ is on the order of $\displaystyle n$.

Of course, any sequence such that $\displaystyle k_n\sim \alpha n$ as $\displaystyle n\to\infty$ would give an exponential limit, not only $\displaystyle k_n=\alpha n$. But you're only asked for one, so let's choose a simple one.
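If it helps, here is a quick Monte Carlo sanity check of this limit (a sketch of mine, not part of the original problem: the values $\theta=2$, $k_n=n$, and all variable names are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

theta = 2.0        # X_i ~ U(0, theta); theta = 2 is an arbitrary choice
n = 1_000          # sample size
trials = 100_000   # Monte Carlo replications

# Y_n = max(X_1, ..., X_n) has CDF (y/theta)^n on [0, theta],
# so it can be sampled directly by inverse transform: Y_n = theta * U^(1/n)
Yn = theta * rng.random(trials) ** (1.0 / n)

# rescaled statistic with the choice k_n = n
Z = n * (theta - Yn)

# compare the empirical CDF of Z with the claimed limit 1 - exp(-x/theta)
for x in (0.5, 1.0, 3.0):
    empirical = (Z <= x).mean()
    limit = 1.0 - np.exp(-x / theta)
    print(f"x = {x}: empirical {empirical:.3f} vs limit {limit:.3f}")
```

The empirical CDF should sit within Monte Carlo noise of $\displaystyle 1-e^{-x/\theta}$ at each $x$.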

5. Okay I get it fully now. Thanks very much.

6. Hi, is there anybody who can help to solve part b?

7. Originally Posted by Questions
Hi, is there anybody who can help to solve part b?
You can compute $\displaystyle P(|Y_n-\theta|>\varepsilon)$ explicitly and then take the limit as $\displaystyle n\to\infty$. Notice that, since $\displaystyle Y_n\leq\theta$ almost surely, $\displaystyle P(|Y_n-\theta|>\varepsilon)=P(Y_n<\theta-\varepsilon)$, and you already dealt with such probabilities in question a).
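Following this hint, since question a) gives $\displaystyle P(Y_n\leq y)=\left(\frac{y}{\theta}\right)^n$, one gets $\displaystyle P(Y_n<\theta-\varepsilon)=\left(\frac{\theta-\varepsilon}{\theta}\right)^n$. A tiny numerical check that this tends to 0 (the values $\theta=2$ and $\varepsilon=0.1$ are arbitrary choices of mine):

```python
theta, eps = 2.0, 0.1   # arbitrary illustration values

# P(|Y_n - theta| > eps) = P(Y_n < theta - eps) = ((theta - eps) / theta)^n,
# using the CDF of Y_n computed in question a)
for n in (10, 100, 1000):
    p = ((theta - eps) / theta) ** n
    print(f"n = {n}: P(|Y_n - theta| > eps) = {p:.3e}")
```

Since $\displaystyle 0<\frac{\theta-\varepsilon}{\theta}<1$, the probability decays geometrically in $n$.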