1. ## Uniform distribution

---

I'm not quite sure what part (a) is asking.

For part (b), I'm thinking that this has something to do with Chebyshev's inequality, but the examples for Chebyshev's inequality in my textbook and notes don't involve $\max\{\dots\}$. So I'm pretty much stuck. Intuitively, I think it will tend to 0 as $n$ tends to infinity, but I don't know how to prove it.

2. I think you'll find this thread interesting. It gives you a possible choice of what $k_n$ should be and there are hints for a proof; this should allow you to understand how $k_n$ must be chosen.

3. Originally Posted by Laurent
I think you'll find this thread interesting. It gives you a possible choice of what $k_n$ should be and there are hints for a proof; this should allow you to understand how $k_n$ must be chosen.
Here's what I've done, but I'm not sure how to continue.

$P(k_n(\theta-Y_n)\leq x)=P(Y_n\geq\theta-\frac{x}{k_n})=1-P(Y_n\leq\theta-\frac{x}{k_n})$

$=1-P(\max(X_1,X_2,\dots,X_n)\leq\theta-\frac{x}{k_n})=1-[P(X_1\leq\theta-\frac{x}{k_n})]^n$

$=1-(1-\frac{x}{\theta k_n})^n$

How do I continue from here? What do they mean when they say $k_n$ is a sequence of constants?

Does it have anything to do with $e^{-x}$? If yes, then I guess I more or less get it.
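As a side note, the finite-$n$ formula derived above can be checked numerically (this is not part of the proof, just a sanity check; the values $\theta=2$, $n=50$, $x=1$ and the scaling $k_n=n$ are assumed for illustration):

```python
# Monte Carlo check of P(k_n*(theta - Y_n) <= x) = 1 - (1 - x/(theta*k_n))^n
# for X_i ~ Uniform(0, theta) and Y_n = max(X_1, ..., X_n).
# theta = 2, n = 50, x = 1 and k_n = n are assumed values for illustration.
import random

random.seed(0)
theta, n, x = 2.0, 50, 1.0
k_n = n  # assumed scaling sequence

trials = 100_000
hits = 0
for _ in range(trials):
    y_n = max(random.uniform(0, theta) for _ in range(n))
    if k_n * (theta - y_n) <= x:
        hits += 1

empirical = hits / trials
exact = 1 - (1 - x / (theta * k_n)) ** n
print(empirical, exact)  # the two values should agree to about two decimals
```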

4. Originally Posted by knighty
Here's what I've done, but I'm not sure how to continue.

$P(k_n(\theta-Y_n)\leq x)=P(Y_n\geq\theta-\frac{x}{k_n})=1-P(Y_n\leq\theta-\frac{x}{k_n})$

$=1-P(\max(X_1,X_2,\dots,X_n)\leq\theta-\frac{x}{k_n})=1-[P(X_1\leq\theta-\frac{x}{k_n})]^n$

$=1-(1-\frac{x}{\theta k_n})^n$

How do I continue from here? What do they mean when they say $k_n$ is a sequence of constants?

Does it have anything to do with $e^{-x}$? If yes, then I guess I more or less get it.
What you did is fine. Now you are asked to choose $k_n$ to be any sequence such that $1-(1-\frac{x}{\theta k_n})^n$ has a limit when $n\to\infty$. There are many possible choices.

The most obvious would be $k_n=n$ or $k_n=n/\theta$ (or $k_n=\alpha n$ with an arbitrary $\alpha>0$). You indeed get $1-e^{-x/\theta}$ or $1-e^{-x}$ (or $1-e^{-x/(\alpha\theta)}$), which is the distribution function of an exponential distribution with some parameter.

What happens for other choices? Suppose for instance $k_n=n^2$, or any sequence with $\frac{k_n}{n}\to\infty$. Then $\frac{nx}{\theta k_n}\to 0$, so $(1-\frac{x}{\theta k_n})^n\to 1$ and the limit is 0 for every $x$, which is not a distribution function.

If to the contrary $\frac{k_n}{n}\to 0$ (still with $k_n\to\infty$), for instance $k_n=\sqrt{n}$, then the limit is 1, which is the distribution function of the Dirac probability measure at 0, i.e. the distribution of a r.v. that is constant, equal to 0. This would be a correct answer, but it is called a "degenerate" limit, which is not very interesting. The best choice is when $k_n$ is on the order of $n$.

Of course, any sequence such that $k_n\sim_n \alpha n$ would give an exponential as the limit, not only $k_n=\alpha n$. But you're only asked for one, so let's choose a simple one.
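The three regimes above can be seen numerically. The following sketch (with assumed values $\theta=1$, $x=1$) evaluates $1-(1-\frac{x}{\theta k_n})^n$ for growing $n$ under the choices $k_n=n$, $k_n=n^2$, and $k_n=\sqrt{n}$:

```python
# Evaluate P(k_n*(theta - Y_n) <= x) = 1 - (1 - x/(theta*k_n))^n for large n
# under three scalings. theta = 1 and x = 1 are assumed values.
import math

theta, x = 1.0, 1.0

def cdf(n, k_n):
    """The exact finite-n probability from the thread's derivation."""
    return 1 - (1 - x / (theta * k_n)) ** n

for n in (10, 100, 10_000, 1_000_000):
    print(n,
          round(cdf(n, n), 4),             # k_n = n: tends to 1 - e^{-x/theta}
          round(cdf(n, n ** 2), 4),        # k_n/n -> infinity: tends to 0
          round(cdf(n, math.sqrt(n)), 4))  # k_n/n -> 0: tends to 1 (degenerate)
```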

5. Okay I get it fully now. Thanks very much.

6. Hi, is there anybody who can help to solve part b?

7. Originally Posted by Questions
Hi, is there anybody who can help to solve part b?
You can compute explicitly $P(|Y_n-\theta|>\varepsilon)$ and then look for the limit. Notice indeed that $P(|Y_n-\theta|>\varepsilon)=P(Y_n<\theta-\varepsilon)$ (because $Y_n\leq\theta$ always, the event $Y_n>\theta+\varepsilon$ is impossible), and you already dealt with such probabilities in question a).
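Following that hint, $P(|Y_n-\theta|>\varepsilon)=P(Y_n<\theta-\varepsilon)=\left(\frac{\theta-\varepsilon}{\theta}\right)^n$, which is a power of a number strictly below 1 and so shrinks to 0. A quick numerical illustration (with assumed values $\theta=2$, $\varepsilon=0.1$):

```python
# P(|Y_n - theta| > eps) = ((theta - eps)/theta)^n for Uniform(0, theta) maxima.
# theta = 2 and eps = 0.1 are assumed values for illustration.
theta, eps = 2.0, 0.1

def tail(n):
    return ((theta - eps) / theta) ** n

for n in (1, 10, 100, 1000):
    print(n, tail(n))  # geometric decay toward 0: Y_n -> theta in probability
```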