
Math Help - \perp sign = independence? (stat inference, notation question)

  1. #1
    Senior Member
    Joined
    Nov 2010
    From
    Hong Kong
    Posts
    255

    \perp sign = independence? (stat inference, notation question)

    I am reading 'Properties of sample variance', and suddenly this notation comes up with no introduction or explanation of what it means:

    Y_i{\perp}W_i as in

    E(Y_iW_i)=0 since Y_i{\perp}W_i and E(Y_i)=E(W_i)=0
    (when proving that the sample variance is an unbiased estimator of the population variance)

    Then the next chapter said 'the sample mean and the sample variance are independent, Y{\perp}S^2'.

    (The Y was supposed to be the sample mean, i.e. with a bar on top, but I couldn't figure out how to place the bar in LaTeX.)

    So my guess is that this sign in statistical inference denotes mutual independence, but I'd like to check.

    Thanks!

    PS: I found this via Google
    http://www.psych.umn.edu/faculty/wal...gs/rodgers.pdf

    Apparently, (linearly) independent is not the same as orthogonal or uncorrelated. So what exactly does that 'perpendicular' sign denote?
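    For concreteness, the unbiasedness being proved here can be checked numerically. This is a quick Monte Carlo sketch (my own illustration, not from the study guide): with the n-1 denominator, the average of many sample variances should land near the true population variance.

    ```python
    import random

    # Sketch: check that the sample variance with denominator n-1
    # is (approximately) unbiased for the population variance.
    random.seed(0)

    def sample_variance(ys):
        n = len(ys)
        ybar = sum(ys) / n
        return sum((y - ybar) ** 2 for y in ys) / (n - 1)

    n, trials = 5, 20000
    # population: standard normal, so the true variance is 1
    estimates = []
    for _ in range(trials):
        ys = [random.gauss(0.0, 1.0) for _ in range(n)]
        estimates.append(sample_variance(ys))

    mean_estimate = sum(estimates) / trials
    print(round(mean_estimate, 2))  # should be close to 1.0
    ```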

  2. #2
    Senior Member
    Joined
    Oct 2009
    Posts
    340
    It is often used to indicate independence, but I've seen it used to mean uncorrelated (which is more natural IMO, but maybe less used). In the case of vectors, as usual, it can also mean literally orthogonal.
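    To make the distinction in this reply concrete, here is a quick simulation sketch (my own illustration): take X standard normal and Y = X^2. Their covariance is 0 by symmetry, so they are uncorrelated, yet Y is a deterministic function of X, so they are clearly not independent.

    ```python
    import random

    # Uncorrelated but NOT independent: X ~ N(0,1) and Y = X^2.
    # cov(X, Y) = E[X^3] = 0, yet Y is fully determined by X.
    random.seed(1)

    N = 100_000
    xs = [random.gauss(0.0, 1.0) for _ in range(N)]
    ys = [x * x for x in xs]

    mx = sum(xs) / N
    my = sum(ys) / N
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / N

    print(abs(cov) < 0.05)       # sample covariance should be near 0
    print(ys[0] == xs[0] ** 2)   # yet Y is an exact function of X
    ```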

  3. #3
    MHF Contributor matheagle's Avatar
    Joined
    Feb 2009
    Posts
    2,763
    Thanks
    5
    Quote Originally Posted by Volga View Post
    (the Y was supposed to be sample mean, ie with a bar on top, but I couldn't find the way how to place the bar in LaTeX?)
    ...
    So what exactly does that 'perpendicular' sign denote?
    \bar in front gives you the sample mean: \bar X
    That's a short bar; a longer bar is \overline, as in \overline X
    We use X\perp Y all the time to state independence between two random variables.
    However, I never knew the TeX for the sign until now.
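    For reference, the symbols mentioned in this thread in a minimal LaTeX document (my own sketch):

    ```latex
    % Symbols discussed in this thread, in compilable form
    \documentclass{article}
    \begin{document}
    Sample mean with a short bar: $\bar{X}$, or with a longer bar: $\overline{X}$.

    Independence of two random variables: $X \perp Y$
    (some texts use a doubled symbol, $X \perp\!\!\!\perp Y$).
    \end{document}
    ```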

  4. #4
    Senior Member
    Joined
    Nov 2010
    From
    Hong Kong
    Posts
    255

    More on independent rvs: splitting into independent terms

    Right. Thanks.

    May I ask another, related, question here (not worth a new thread)? From the same proof: to prove that the sample variance is an unbiased estimator of the population variance, the author of my study guide uses a 'trick' of splitting Y_i-\bar{Y} into independent terms (that's where the question about the sign comes from).

    It is done as follows:

    Y_i-\bar{Y}=Y_i-\frac{1}{n}\Sigma_{k=1}^nY_k=(Y_i-\frac{1}{n}Y_i)-\frac{1}{n}\Sigma_{k[not.equal]i}Y_k

    I hope you can read that. What is the LaTeX for 'not equal'?

    (And after splitting the original sample variance formula into two independent parts, he finds the expected value which, after some manipulation, equals the population variance.)

    My question is pretty basic: I don't understand the 'splitting' manipulation above, and especially the meaning of \Sigma_{k[not.equal]i}Y_k. Is that the sum of all the Ys except Y_i? Then shouldn't the upper limit of this sum be n-1 instead of n?

    thanks!!

  5. #5
    MHF Contributor matheagle's Avatar
    Joined
    Feb 2009
    Posts
    2,763
    Thanks
    5
    The LaTeX for 'not equal' is \ne (or \neq).
    They're just removing the Y_i term from the sum.
    There are n-1 terms, but the index still has to run up to n, just in case i isn't n.
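    The splitting identity in question can be verified numerically for every i. This is a quick sketch (my own check, not from the thread): Y_i - Ybar = (Y_i - Y_i/n) - (1/n) * sum over all k not equal to i of Y_k.

    ```python
    import random

    # Numeric check of the splitting used in the proof:
    #   Y_i - Ybar = (Y_i - Y_i/n) - (1/n) * sum_{k != i} Y_k
    random.seed(2)

    n = 7
    ys = [random.gauss(0.0, 1.0) for _ in range(n)]
    ybar = sum(ys) / n

    for i in range(n):
        lhs = ys[i] - ybar
        # n-1 terms: the index runs over all k, skipping k == i
        rest = sum(ys[k] for k in range(n) if k != i)
        rhs = (ys[i] - ys[i] / n) - rest / n
        assert abs(lhs - rhs) < 1e-12

    print("identity holds for every i")
    ```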

  6. #6
    Senior Member
    Joined
    Nov 2010
    From
    Hong Kong
    Posts
    255
    Quote Originally Posted by Volga View Post
    apparently, (linearly) independent is not the same as orthogonal or uncorrelated. So what exactly does that 'perpendicular' sign denote?
    Actually, where can I read up on the geometric (vector-space) view of probability, for example viewing cov(X,Y)=<X,Y> (covariance as an inner product on the space of random variables X and Y)?
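    The inner-product behaviour can at least be illustrated numerically. This is a quick sketch (my own illustration, using finite samples as stand-ins for random variables): sample covariance is symmetric and bilinear, and satisfies Cauchy-Schwarz, which is exactly why |correlation| <= 1.

    ```python
    import random

    # Sketch: covariance behaves like an inner product on centered
    # random variables (here approximated by finite samples).
    random.seed(3)

    N = 1000
    X = [random.gauss(0, 1) for _ in range(N)]
    Y = [random.gauss(0, 1) for _ in range(N)]
    Z = [random.gauss(0, 1) for _ in range(N)]

    def cov(a, b):
        ma, mb = sum(a) / N, sum(b) / N
        return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / N

    # symmetry: <X, Y> = <Y, X>
    assert abs(cov(X, Y) - cov(Y, X)) < 1e-12
    # bilinearity: <2X + Z, Y> = 2<X, Y> + <Z, Y>
    lin = cov([2 * x + z for x, z in zip(X, Z)], Y)
    assert abs(lin - (2 * cov(X, Y) + cov(Z, Y))) < 1e-9
    # Cauchy-Schwarz: cov(X,Y)^2 <= var(X) var(Y), i.e. |corr| <= 1
    assert cov(X, Y) ** 2 <= cov(X, X) * cov(Y, Y) + 1e-12

    print("covariance satisfies the inner-product identities on this sample")
    ```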

  7. #7
    Senior Member
    Joined
    Oct 2009
    Posts
    340
    Quote Originally Posted by Volga View Post
    Actually, where can I read up about geometric (vector) view of probability, for example, viewing cov(X,Y)=<X,Y> (covariance as inner product on the space of rvs X and Y)?
    I'm not sure where the best place to get this sort of thing is. Probably reading Papa Rudin (Real and Complex Analysis) is a good start for the material about function spaces (obviously you'll need at least as much real analysis as Baby Rudin covers).

  8. #8
    Senior Member
    Joined
    Nov 2010
    From
    Hong Kong
    Posts
    255
    I see. Baby Rudin - is it as good as the reviews on Amazon say? What level is it suitable for - is it OK at undergrad level?

  9. #9
    Senior Member
    Joined
    Oct 2009
    Posts
    340
    It should be fine for undergrad. You'll end up going quite a bit off the subject matter you're studying, though, if you plan on making a real analysis excursion (though most of the material is mandatory for measure-theoretic probability). Depending on your mathematical maturity, the difficulty could range from challenging to impossible. Rosenlicht is a more accessible alternative (it is also only $10, whereas Rudin is $100+ unless you illegally download it or something), but I dislike the way he approaches some topics. The exercises in Rudin are very good and challenging, and help train your mind so that you can digest the more difficult material in the sequel.

  10. #10
    Senior Member
    Joined
    Nov 2010
    From
    Hong Kong
    Posts
    255
    There is a thread in the Chat about an [easy] book on Analysis, I'll check there too. It seems that all roads lead to analysis ))) I may have fewer questions popping up here and there after I read an analysis book.
