1. ## Multivariate Normal

I have a couple of questions (maybe all related), and I think they come up because my matrix math is weak and rusty.

1) Why is Z = (Covariance Matrix)^(-1/2) * (X_vector - mu_vector) distributed N_p(0, I)?

2) Z^2 does not mean (Covariance Matrix)^(-1/2) * (X_vector - mu_vector) * (Covariance Matrix)^(-1/2) * (X_vector - mu_vector). Instead the order changes to:

(X_vector - mu_vector)' * (Covariance Matrix)^(-1/2) * (Covariance Matrix)^(-1/2) * (X_vector - mu_vector).

I understand that for the calculations to work the dimensions need to match, but why is the order the way it is, and how do you know?

3) Finally trying to show sqrt(n) * (Covariance Matrix)^(-1/2) * (X_vector - mu_vector) is N_p (0,I).

I bet these all point to the same gap in my knowledge, but if anyone can help explain, I would be grateful. Thanks!

Brian

2. Hello,

I'm quite skeptical about your notation... What is "(covariance matrix)^(-1/2)"? Is it the determinant of the covariance matrix?
In that case, I don't see where
Z = (Covariance Matrix)^(-1/2) * (X_vector - mu_vector) N_p (0,I)
comes from.

2) To say Z^2 does not mean (Covariance Matrix)^(-1/2) * (X_vector - mu_vector) * (Covariance Matrix)^(-1/2) * (X_vector - mu_vector). Instead the order changes to:

(X_vector - mu_vector)' * (Covariance Matrix)^(-1/2)*(Covariance Matrix)^(-1/2)* (X_vector - mu_vector).
But Z^2 doesn't follow a normal distribution?
Also, what is the square of a vector?

3) Finally trying to show sqrt(n) * (Covariance Matrix)^(-1/2) * (X_vector - mu_vector) is N_p (0,I).
Shouldn't it rather be "goes to N_p(0,I)" as n goes to infinity?
But still, I don't agree with that formula...

It looks like the multidimensional central limit theorem. And unlike the one-dimensional theorem, we cannot simply divide by the "standard deviation" (which you seem to have interpreted as (Covariance Matrix)^(-1/2)) to get a "standard" normal distribution...

Where did you get these formulae from?

3. Hello.

First: (covariance matrix)^(-1/2) is the inverse of the square root matrix of the covariance matrix. I don't know how to type symbols in this forum. When I type N_p(0,I), this means a multivariate normal distribution in p variables with mean equal to the zero vector and variance equal to the identity matrix.

Second: The book shows Z^2 =
(X_vector - mu_vector)' * (Covariance Matrix)^(-1/2) * (Covariance Matrix)^(-1/2) * (X_vector - mu_vector), and then notes that it is distributed chi-square.

where Z = (Covariance Matrix)^(-1/2) * (X_vector - mu_vector) ~ N_p(0,I).

My question is how do you know the order to set up the multiplication to get Z^2?
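To make the question concrete, here is a NumPy sketch of what I mean (the matrix K, the dimension p, and the sample size are made-up example values, not from the book):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 3
# Made-up positive-definite covariance matrix K (just an example)
A = rng.normal(size=(p, p))
K = A @ A.T + p * np.eye(p)
mu = np.zeros(p)

# Symmetric square root of K via its eigendecomposition, then its inverse
w, V = np.linalg.eigh(K)
K_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T

# Draw many X ~ N_p(mu, K) and form Z = K^(-1/2)(X - mu) for each draw
X = rng.multivariate_normal(mu, K, size=200_000)
Z = (X - mu) @ K_inv_sqrt        # K_inv_sqrt is symmetric, so this is K^(-1/2)(x - mu) row-wise

# "Z^2" must mean the scalar Z'Z (1 x p times p x 1 gives 1 x 1);
# the other order, Z Z', would be a p x p matrix instead of a number.
q = np.einsum('ij,ij->i', Z, Z)  # Z'Z for every draw

# Z'Z should be chi-square with p degrees of freedom: mean p, variance 2p
print(q.mean(), q.var())         # roughly 3 and 6 here
```

So the dimensions themselves force the order: only Z'Z gives a single number that can be called Z^2.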

Third: Sorry, I mistyped! It should be:

3) Finally trying to show sqrt(n) * (Covariance Matrix)^(-1/2) * (SampleMean_vector - mu_vector) is N_p (0,I).

Applied Multivariate Statistical Analysis by Johnson and Wichern

Are you able to help?

4. Originally Posted by B_Miner
Hello.

First: (covariance matrix)^(-1/2) is the inverse of the square root matrix of the covariance matrix. I don't know how to type symbols in this forum. When I type N_p(0,I), this means a multivariate normal distribution in p variables with mean equal to the zero vector and variance equal to the identity matrix.
Okay, I'm sorry. I just didn't know the concept of a square root matrix.

For the first question.
Let K denote the covariance matrix of the Gaussian vector X.
$\displaystyle X \sim \mathcal{N}_p(\mu,K) \Rightarrow X-\mu \sim\mathcal{N}_p(0,K)$ (0 is the zero column vector of dimension p)

Now, the key step is explained in the PDF I've attached.

$\displaystyle Y=(K^{1/2})^{-1} (X-\mu)$ has a 0 mean vector and a covariance matrix equal to :
$\displaystyle (K^{1/2})^{-1}K\left((K^{1/2})^{-1}\right)^T$

Now, we know that a covariance matrix is symmetric. I've read on Wikipedia that this implies the square root matrix of K is also symmetric, so $\displaystyle K^{1/2}$ is symmetric.
And so is its inverse, since for an invertible symmetric matrix $\displaystyle (M^{-1})^T=(M^T)^{-1}=M^{-1}$.

Hence $\displaystyle (K^{1/2})^{-1}=\left((K^{1/2})^{-1}\right)^T$

So the covariance matrix of Y is $\displaystyle (K^{1/2})^{-1}K(K^{1/2})^{-1}$

But we have $\displaystyle K^{1/2}K^{1/2}=K$. Left multiply by $\displaystyle (K^{1/2})^{-1}$ :
$\displaystyle K^{1/2}=(K^{1/2})^{-1}K$

So finally, the covariance matrix of Y is $\displaystyle K^{1/2}(K^{1/2})^{-1}=I\quad\quad\quad \blacksquare$
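If it helps, the whole chain can be checked numerically. Here is a quick NumPy sketch (the matrix K below is made up; any symmetric positive-definite matrix works):

```python
import numpy as np

rng = np.random.default_rng(1)
p = 4
# Made-up symmetric positive-definite K standing in for the covariance matrix
B = rng.normal(size=(p, p))
K = B @ B.T + p * np.eye(p)

# Symmetric square root K^(1/2) built from the eigendecomposition of K
w, V = np.linalg.eigh(K)
K_sqrt = V @ np.diag(np.sqrt(w)) @ V.T
K_sqrt_inv = np.linalg.inv(K_sqrt)

# K^(1/2) is symmetric, and so is its inverse
print(np.allclose(K_sqrt, K_sqrt.T), np.allclose(K_sqrt_inv, K_sqrt_inv.T))  # True True

# Covariance of Y = (K^(1/2))^(-1)(X - mu) is A K A^T with A = (K^(1/2))^(-1),
# and it collapses to the identity exactly as in the algebra above
cov_Y = K_sqrt_inv @ K @ K_sqrt_inv.T
print(np.allclose(cov_Y, np.eye(p)))  # True
```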

Third: Sorry, I mistyped! It should be:

3) Finally trying to show sqrt(n) * (Covariance Matrix)^(-1/2) * (SampleMean_vector - mu_vector) is N_p (0,I).

Applied Multivariate Statistical Analysis by Johnson and Wichern

Are you able to help?
The sample mean vector (let's call it Z) is $\displaystyle Z=\frac 1n \sum_{i=1}^n X_i$.

$\displaystyle Z-\mu=\frac 1n\sum_{i=1}^n (X_i-\mu)=\frac 1n\sum_{i=1}^n Z_i$, where the $\displaystyle Z_i$ are independent and follow a normal distribution $\displaystyle \mathcal{N}_p(0,K)$

Now, I must say I'm a bit lost. I need to give it a thought later on...
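Still, a simulation at least supports what we are trying to show (a NumPy sketch; mu, K, n, and the number of replications below are all made up). Note that when the X_i are exactly normal, the sample mean minus mu is N_p(0, K/n), so the result is exact for every n, not only in the limit:

```python
import numpy as np

rng = np.random.default_rng(2)
p, n, reps = 3, 25, 100_000        # made-up dimension, sample size, replications
B = rng.normal(size=(p, p))
K = B @ B.T + p * np.eye(p)        # made-up covariance matrix
mu = np.array([1.0, -2.0, 0.5])    # made-up mean vector

w, V = np.linalg.eigh(K)
K_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T

# reps experiments, each with n iid draws from N_p(mu, K)
X = rng.multivariate_normal(mu, K, size=(reps, n))
xbar = X.mean(axis=1)                        # reps sample-mean vectors
W = np.sqrt(n) * (xbar - mu) @ K_inv_sqrt    # sqrt(n) K^(-1/2)(xbar - mu)

print(np.round(W.mean(axis=0), 2))   # close to the zero vector
print(np.round(np.cov(W.T), 2))      # close to the p x p identity matrix
```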
Second: The book shows Z^2 =
(X_vector - mu_vector)' * (Covariance Matrix)^(-1/2) * (Covariance Matrix)^(-1/2) * (X_vector - mu_vector), and then notes that it is distributed chi-square.

where Z = (Covariance Matrix)^(-1/2) * (X_vector - mu_vector) ~ N_p(0,I).

My question is how do you know the order to set up the multiplication to get Z^2?
Same problem here. Need to think on that later on...

Maybe you'll be able to answer your questions if, as you said, these questions are completely related.
It is hard for me to use square root matrices since I've only known about them for 30 minutes, so I feel quite uneasy.

5. I figured you meant the sample mean there.
This can be found online under "Affine transformation" in the Wikipedia article on the multivariate normal distribution.

6. Hi MathEagle.

I knew of that general result. Could you help me see how to use it to show why, for example:

Z = (Covariance Matrix)^(-1/2) * (X_vector - mu_vector) ~ N_p (0,I)

7. Originally Posted by B_Miner
Hi MathEagle.

I knew of that general result. Could you help me see how to use it to show why, for example:

Z = (Covariance Matrix)^(-1/2) * (X_vector - mu_vector) ~ N_p (0,I)
I showed you why!!!

8. I thought Moo helped you.
Meanwhile she's giving me an earful via the messages.
You should thank her and she should stop barking, or mooing?
My ear is going to fall off.

9. Originally Posted by B_Miner
Hi MathEagle.

I knew of that general result. Could you help me see how to use it to show why, for example:

Z = (Covariance Matrix)^(-1/2) * (X_vector - mu_vector) ~ N_p (0,I)

(1) Moo showed that this is normal, which you should have known

(2) Next, E(X_vector - mu_vector) = 0 (the zero vector).
Multiplying by a constant matrix keeps the mean zero:

E[(Covariance Matrix)^(-1/2) * (X_vector - mu_vector)] = 0

(3) V((Covariance Matrix)^(-1/2) * (X_vector - mu_vector)) = I
via the algebra at Wikipedia.
You can kill the mean and just show
V((Covariance Matrix)^(-1/2) * X_vector) = I.

(2) and (3) follow even without normality.
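As a sanity check on that last remark, here is a NumPy sketch with a deliberately non-normal X (K, mu, and the exponential components are all made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
p, n = 3, 500_000
B = rng.normal(size=(p, p))
K = B @ B.T + p * np.eye(p)         # made-up target covariance
mu = np.array([2.0, 0.0, -1.0])     # made-up mean vector

w, V = np.linalg.eigh(K)
K_sqrt = V @ np.diag(np.sqrt(w)) @ V.T
K_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T

# Build a clearly non-normal X with mean mu and covariance K:
# centred exponential components (mean 0, covariance I, very skewed),
# mixed through K^(1/2), so cov(X) = K^(1/2) I K^(1/2) = K
E = rng.exponential(1.0, size=(n, p)) - 1.0
X = mu + E @ K_sqrt

Z = (X - mu) @ K_inv_sqrt           # the same standardisation as above
print(np.round(Z.mean(axis=0), 2))  # ~ zero vector
print(np.round(np.cov(Z.T), 2))     # ~ identity, even though Z is not normal
```

So the mean-zero and identity-covariance parts need nothing but constant matrices and linearity; only the normality of Z itself uses the normality of X.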

10. Sorry Moo. I did not understand your solution; it was beyond me. The wiki piece on affine transformations looked closer to my level of understanding.

11. Originally Posted by B_Miner
Sorry Moo. I did not understand your solution; it was beyond me. The wiki piece on affine transformations looked closer to my level of understanding.
Then next time, say it!!!!!

It would have been far better to say it rather than completely ignore it!!!!!!!

For your information, I used exactly the formula from the Wikipedia article.
Then, I just used operations over matrices.
Point out what you didn't understand. IT IS NOT POSSIBLE TO HELP OTHERWISE

12. For E[(Covariance Matrix)^(-1/2) * (X_vector - mu_vector)] = 0,

are you assuming (Covariance Matrix)^(-1/2) and (X_vector - mu_vector) are independent, so that you can do

E[(Covariance Matrix)^(-1/2)] * E[(X_vector - mu_vector)]
= E[(Covariance Matrix)^(-1/2)] * 0
?

13. Moo-

Wow. No need for all that. I appreciate your help. I was trying to read the posts while at work, and as I was finishing yours I read MathEagle's. I did not yet have time to respond to yours. Thanks for your help though.

14. And on that note boys, girls and bovines .... Thread closed (otherwise someone's going to lose an eye).