
Math Help - Uniform distribution

  1. #1
    knighty
    Newbie · Joined: Mar 2009 · Posts: 22

    Uniform distribution


    [The problem statement was attached as an image; from the replies it reads roughly: X_1,\dots,X_n are i.i.d. Uniform(0,\theta) and Y_n=\max(X_1,\dots,X_n). (a) Find a sequence of constants k_n such that k_n(\theta-Y_n) converges in distribution as n\to\infty. (b) Show that Y_n is a consistent estimator of \theta, i.e. P(|Y_n-\theta|>\varepsilon)\to 0 for every \varepsilon>0.]

    I'm not quite sure what part (a) is asking.

    For part (b), I'm thinking that this has something to do with Chebyshev's inequality, but the examples of Chebyshev's inequality in my textbook and notes don't involve the use of max{...}, so I'm pretty much stuck. Intuitively, I think the probability tends to 0 as n tends to infinity, but I don't really know how to prove it.

  2. #2
    Laurent
    MHF Contributor · Joined: Aug 2008 · From: Paris, France · Posts: 1,174
    I think you'll find this thread interesting. It gives you a possible choice of what k_n should be and there are hints for a proof; this should allow you to understand how k_n must be chosen.

  3. #3
    knighty
    Newbie · Joined: Mar 2009 · Posts: 22
    Quote Originally Posted by Laurent View Post
    I think you'll find this thread interesting. It gives you a possible choice of what k_n should be and there are hints for a proof; this should allow you to understand how k_n must be chosen.
    Here's what I've done, but I'm not sure how to continue.

    P(k_n(\theta-Y_n)\leq x)=P(Y_n\geq\theta-\frac{x}{k_n})=1-P(Y_n\leq\theta-\frac{x}{k_n})

    =1-P(\max(X_1,X_2,...,X_n)\leq\theta-\frac{x}{k_n})=1-[P(X_1\leq\theta-\frac{x}{k_n})]^n

    =1-(1-\frac{x}{\theta k_n})^n

    How do I continue from here? What do they mean when they say k_n is a sequence of constants?

    Does it have anything to do with e^{-x}? If yes, then I guess I more or less get it.
    Last edited by knighty; April 4th 2009 at 06:35 AM.

  4. #4
    Laurent
    MHF Contributor · Joined: Aug 2008 · From: Paris, France · Posts: 1,174
    Quote Originally Posted by knighty View Post
    Here's what I've done, but I'm not sure how to continue.

    P(k_n(\theta-Y_n)\leq x)=P(Y_n\geq\theta-\frac{x}{k_n})=1-P(Y_n\leq\theta-\frac{x}{k_n})

    =1-P(\max(X_1,X_2,...,X_n)\leq\theta-\frac{x}{k_n})=1-[P(X_1\leq\theta-\frac{x}{k_n})]^n

    =1-(1-\frac{x}{\theta k_n})^n

    How do I continue from here? What do they mean when they say k_n is a sequence of constants?

    Does it have anything to do with e^{-x}? If yes, then I guess I more or less get it.
    What you did is fine. Now you are asked to choose k_n to be any sequence such that 1-(1-\frac{x}{\theta k_n})^n has a limit, for every x, as n\to\infty. There are many possible choices.

    The most obvious would be k_n=n or k_n=n/\theta (or k_n=\alpha n with an arbitrary constant \alpha>0). You indeed get 1-e^{-\frac{x}{\theta}} or 1-e^{-x} (or 1-e^{-\frac{x}{\alpha\theta}}), which is the distribution function of an exponential distribution with some parameter.
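
    Explicitly, the computation behind this uses the standard limit (1+\frac{a_n}{n})^n\to e^{a} when a_n\to a: with k_n=\alpha n,

    1-\left(1-\frac{x}{\alpha\theta n}\right)^n\to 1-e^{-\frac{x}{\alpha\theta}} for every x\geq 0.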

    What happens for other choices? Suppose for instance k_n=n^2 or any sequence with \frac{k_n}{n}\to\infty. Then you can see that the limit would be 0, which is not a distribution function.

    If, on the contrary, \frac{k_n}{n}\to 0 (still with k_n\to\infty), for instance k_n=\sqrt{n}, then the limit is 1, which is the distribution function of the Dirac probability measure at 0, i.e. the distribution of a random variable that is constant, equal to 0. This would be a correct answer, but it is called a "degenerate" limit, which is not very interesting. The best choice is when k_n is of the order of n.

    Of course, any sequence such that k_n\sim\alpha n (that is, \frac{k_n}{\alpha n}\to 1) would give an exponential limit, not only k_n=\alpha n exactly. But you're only asked for one, so let's keep it simple.
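
    If you want to check this numerically, here is a quick simulation sketch in Python (the NumPy import, \theta=2 and the sample sizes are arbitrary choices of mine, not part of the problem). It samples Y_n directly from its distribution function (y/\theta)^n and compares the empirical distribution of n(\theta-Y_n) with the limit 1-e^{-x/\theta}:

        import numpy as np

        theta = 2.0      # arbitrary true parameter, only for the demo
        n = 10_000       # sample size behind each Y_n
        reps = 100_000   # number of simulated copies of Y_n

        rng = np.random.default_rng(0)
        # Y_n = max(X_1,...,X_n) has cdf (y/theta)^n on [0, theta], so
        # inverse-transform sampling gives Y_n = theta * U^(1/n), U ~ Uniform(0,1).
        U = rng.uniform(size=reps)
        Y = theta * U ** (1.0 / n)
        Z = n * (theta - Y)          # the choice k_n = n

        # Compare the empirical cdf of Z with the limiting cdf 1 - exp(-x/theta)
        for x in (0.5, 1.0, 2.0, 5.0):
            print(f"x={x}: empirical {np.mean(Z <= x):.4f}  limit {1 - np.exp(-x / theta):.4f}")

    The two columns should agree to about two decimal places, consistent with the exponential limit above.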

  5. #5
    knighty
    Newbie · Joined: Mar 2009 · Posts: 22
    Okay, I fully get it now. Thanks very much.

  6. #6
    Questions
    Newbie · Joined: Apr 2009 · Posts: 2
    Hi, can anybody help me solve part (b)?

  7. #7
    Laurent
    MHF Contributor · Joined: Aug 2008 · From: Paris, France · Posts: 1,174
    Quote Originally Posted by Questions View Post
    Hi, is there anybody who can help to solve part b?
    You can compute P(|Y_n-\theta|>\varepsilon) explicitly and then look for the limit. Notice indeed that P(|Y_n-\theta|>\varepsilon)=P(Y_n<\theta-\varepsilon), since Y_n\leq\theta almost surely, and you already dealt with such probabilities in question a).
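
    Spelled out (using P(Y_n\leq t)=(t/\theta)^n, as in question a)): for 0<\varepsilon<\theta,

    P(|Y_n-\theta|>\varepsilon)=P(Y_n<\theta-\varepsilon)=\left(\frac{\theta-\varepsilon}{\theta}\right)^n=\left(1-\frac{\varepsilon}{\theta}\right)^n\to 0

    as n\to\infty, since 0<1-\frac{\varepsilon}{\theta}<1. Hence Y_n\to\theta in probability, i.e. Y_n is a consistent estimator of \theta.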
