# \perp sign = independence? (stat inference, notation question)

• Feb 6th 2011, 02:06 AM
Volga
\perp sign = independence? (stat inference, notation question)
I am reading Properties of sample variance, and suddenly this notation comes up with no introduction explaining what it means:

$Y_i{\perp}W_i$ as in

$E(Y_iW_i)=0$ since $Y_i{\perp}W_i$ and $E(Y_i)=E(W_i)=0$
(when proving that the sample variance is an unbiased estimator of the population variance)

Then the next chapter said 'the sample mean and the sample variance are independent, $Y{\perp}S^2$'.

(the Y was supposed to be the sample mean, i.e. with a bar on top, but I couldn't find how to place the bar in LaTeX?)

So my guess is that this sign in statistical inference denotes mutual independence, but I'd like to check.

Thanks!

http://www.psych.umn.edu/faculty/wal...gs/rodgers.pdf

Apparently, (linearly) independent is not the same as orthogonal or uncorrelated. So what exactly does that 'perpendicular' sign denote?
• Feb 6th 2011, 05:34 AM
theodds
It is often used to indicate independence, but I've seen it used to mean uncorrelated (which is more natural IMO, but maybe less common). In the case of vectors, as usual, it can also mean literally orthogonal.
• Feb 6th 2011, 07:11 AM
matheagle
Quote:

Originally Posted by Volga
I am reading Properties of sample variance, and suddenly this notation comes up with no introduction explaining what it means:

$Y_i{\perp}W_i$

...

(the Y was supposed to be the sample mean, i.e. with a bar on top, but I couldn't find how to place the bar in LaTeX?)

\bar in front gives you the sample mean $\bar X$.
That's a short bar; a longer bar is \overline: $\overline X$.
We use $X\perp Y$ all the time to state independence between two rvs.
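For what it's worth, the original claim (independence plus zero means implies $E(Y_iW_i)=E(Y_i)E(W_i)=0$) is easy to sanity-check by simulation. A minimal sketch, assuming made-up independent normal samples (the names and parameters are just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent, zero-mean samples (distributions chosen arbitrarily)
Y = rng.normal(loc=0.0, scale=1.0, size=n)
W = rng.normal(loc=0.0, scale=2.0, size=n)

# Independence + zero means imply E(YW) = E(Y)E(W) = 0,
# so the sample average of Y*W should be close to 0.
print(np.mean(Y * W))
```

The printed value hovers near zero, with fluctuations on the order of $\sigma_Y\sigma_W/\sqrt{n}$.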
• Feb 7th 2011, 02:44 AM
Volga
more on independent rvs - splitting into independent terms
Right. Thanks.

May I ask another, related, question here (not worth a new thread)? From the same proof, to show that the sample variance is an unbiased estimator of the population variance, the author of my study guide uses a 'trick' of splitting $Y_i-\bar{Y}$ into independent terms (that's where the question about the sign came from).

It is done as follows:

$Y_i-\bar{Y}=Y_i-\frac{1}{n}\sum_{k=1}^nY_k=\left(Y_i-\frac{1}{n}Y_i\right)-\frac{1}{n}\sum_{k\ne i}Y_k$

I hope you can read that - what is the LaTeX for 'not equal'?

(And after splitting the sample variance original formula into two parts which are independent, he finds the expected value which, after manipulations, is pop'n variance.)

My question is pretty basic - I don't understand the 'splitting' manipulation above, and especially the meaning of $\sum_{k\ne i}Y_k$. Is that the sum of all the $Y_k$ except $Y_i$? Then where is the upper limit of this sum (I mean the n-1 on top)?

thanks!!
• Feb 7th 2011, 05:06 AM
matheagle
\ne gives $\ne$
They're just removing the $Y_i$ term from the sum.
There are n-1 terms, but the index still runs up to n (just skipping k=i), since i isn't necessarily n.
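The split itself is pure algebra, so it can be checked numerically. A small sketch, with a made-up sample and an arbitrary index i:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
Y = rng.normal(size=n)
i = 2  # any index works; i need not be the last one

# The split: Y_i - Ybar = (Y_i - Y_i/n) - (1/n) * sum_{k != i} Y_k
lhs = Y[i] - Y.mean()
rhs = (Y[i] - Y[i] / n) - (1.0 / n) * (Y.sum() - Y[i])

print(np.isclose(lhs, rhs))  # True
```

The point of the manipulation is that the first bracket involves only $Y_i$ while the remaining sum involves only the other $Y_k$, so the two pieces are independent.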
• Feb 17th 2011, 08:49 PM
Volga
Quote:

Originally Posted by Volga
I am reading Properties of sample variance, and suddenly this notation comes up with no introduction explaining what it means:

$Y_i{\perp}W_i$ as in

$E(Y_iW_i)=0$ since $Y_i{\perp}W_i$ and
...
http://www.psych.umn.edu/faculty/wal...gs/rodgers.pdf

Apparently, (linearly) independent is not the same as orthogonal or uncorrelated. So what exactly does that 'perpendicular' sign denote?

Actually, where can I read up on the geometric (vector) view of probability, for example viewing covariance $cov(X,Y)$ as an inner product on the space of random variables?
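To make the picture I have in mind concrete, here is a small numerical sketch (the variables are made up): after centering the samples, covariance behaves like the inner product $\frac{1}{n}\sum_k x_k y_k$, and correlation is the cosine of the angle between the centered sample vectors:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
X = rng.normal(size=n)
Z = rng.normal(size=n)
Y = 0.6 * X + 0.8 * Z  # a variable correlated with X (true corr 0.6)

# Center the samples; then cov(X, Y) behaves like the inner product
# <x, y> = (1/n) sum x_k y_k, and corr(X, Y) is the cosine of the
# angle between the centered vectors.
x = X - X.mean()
y = Y - Y.mean()

cov_xy = np.dot(x, y) / n
cos_angle = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
corr_xy = np.corrcoef(X, Y)[0, 1]

print(cov_xy, cos_angle, corr_xy)
```

The cosine and the correlation coefficient agree to machine precision, which is exactly the geometric view: uncorrelated means the centered vectors are orthogonal.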
• Feb 18th 2011, 01:39 PM
theodds
Quote:

Originally Posted by Volga
Actually, where can I read up on the geometric (vector) view of probability, for example viewing covariance $cov(X,Y)$ as an inner product on the space of random variables?

I'm not sure where the best place to get this sort of thing is. Reading Papa Rudin is probably a good start for the material on function spaces (obviously you'll need at least as much real analysis as Baby Rudin covers).
• Feb 18th 2011, 01:49 PM
Volga
I see. Baby Rudin - is it as good as the reviews on Amazon say? What level is it suitable for - is it OK at undergrad level?
• Feb 18th 2011, 02:07 PM
theodds
It should be fine for undergrad. You'll end up going quite a bit off the subject matter you're studying, though, if you plan on making a real analysis excursion (though most of the material is mandatory for measure-theoretic probability). Depending on your mathematical maturity, the difficulty could range from challenging to impossible. Rosenlicht is a more accessible alternative (it is also only \$10, whereas Rudin is \$100+ unless you illegally download it or something), but I dislike the way he approaches some topics. The exercises in Rudin are very good, challenging, and help train your mind so that you can digest the more difficult material in the sequel.
• Feb 18th 2011, 02:12 PM
Volga
There is a thread in the Chat about an [easy] book on Analysis; I'll check there too. It seems that all roads lead to analysis ))) I may have fewer questions popping up here and there after I read an analysis book.