# Thread: Vector of independent normal RV's - Very difficult for me

1. ## Vector of independent normal RV's - Very difficult for me

Let $X = (X_1,...,X_n)^T$ be a column vector of independent N(0,1) distributed random variables. Let A be an $n \times n$ orthogonal matrix.

(1) Consider the random column vector $Y = (Y_1,..., Y_n)^T$, obtained via the linear transformation $Y = AX$. Show that $Y$ is again a vector of independent N(0,1) distributed random variables.

(2) Let A be an orthogonal matrix whose last row is $(1, \dots, 1)/\sqrt{n}$. Consider again the linear transformation $Y = AX$. Show that

$W = \sum_{i=1}^n (X_i - \bar{X})^2 = \sum_{i=1}^n X_i^2 - n \bar{X}^2 = \sum_{i=1}^{n-1} Y_i^2 \sim \chi_{n-1}^2$

(3) Let $Z_1, ..., Z_n$ be independent $N(\mu, \sigma^2)$ random variables, with sample variance $S^2 = \frac{1}{n-1} \sum_{i=1}^n (Z_i - \bar{Z})^2$. Show using (1) - (2) that $(n-1)S^2 / \sigma^2 \sim \chi_{n-1}^2$.
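A quick NumPy sanity check of the identities in (2) may help make them concrete. The QR-based construction below is just one convenient way to complete the row $(1,\dots,1)/\sqrt{n}$ to an orthogonal matrix; all variable names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Build an orthogonal A whose last row is (1,...,1)/sqrt(n):
# put that vector as the first column of a random matrix,
# QR-factorise, then take Q^T with that row moved to the end.
M = rng.standard_normal((n, n))
M[:, 0] = 1.0 / np.sqrt(n)
Q, _ = np.linalg.qr(M)
if Q[0, 0] < 0:              # QR may flip the sign of the first column
    Q[:, 0] = -Q[:, 0]
A = np.vstack([Q[:, 1:].T, Q[:, 0]])

assert np.allclose(A @ A.T, np.eye(n))        # A is orthogonal
assert np.allclose(A[-1], 1.0 / np.sqrt(n))   # last row is the special one

X = rng.standard_normal(n)
Y = A @ X
Xbar = X.mean()
W = np.sum((X - Xbar) ** 2)

assert np.isclose(W, np.sum(X**2) - n * Xbar**2)  # first equality
assert np.isclose(Y[-1], np.sqrt(n) * Xbar)       # Y_n = sqrt(n) * Xbar
assert np.isclose(W, np.sum(Y[:-1] ** 2))         # so W = Y_1^2 + ... + Y_{n-1}^2
```

The point of the last two checks: orthogonality gives $\sum Y_i^2 = \sum X_i^2$, and the special last row makes $Y_n = \sqrt{n}\,\bar{X}$, so subtracting $Y_n^2 = n\bar{X}^2$ leaves exactly $W$ as a sum of $n-1$ squared independent $N(0,1)$ variables.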

2. Originally Posted by RoyalFlush
Let $X = (X_1,...,X_n)^T$ be a column vector of independent N(0,1) distributed random variables. Let A be an $n \times n$ orthogonal matrix.

(1) Consider the random column vector $Y = (Y_1,..., Y_n)^T$, obtained via the linear transformation $Y = AX$. Show that $Y$ is again a vector of independent N(0,1) distributed random variables.
For this it is sufficient to show that the covariance matrix of $Y$ is the identity (and you can do that by considering what the $i,j$-th element must be). Since $Y$ is jointly normal, zero off-diagonal entries give independence, and unit diagonal entries give variance 1.

You will need to know some properties of orthogonal matrices, and how to form the covariance matrix of $Y = AX$ from $A$ and the covariance matrix of $X$.
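The hint above can be checked numerically: since $\operatorname{Cov}(Y) = A\,\operatorname{Cov}(X)\,A^T = A I A^T = I$, the sample covariance of many draws of $Y$ should be close to the identity. A minimal Monte Carlo sketch (sample sizes and names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 4, 200_000  # dimension and number of Monte Carlo draws

# A random orthogonal matrix via QR of a Gaussian matrix
A, _ = np.linalg.qr(rng.standard_normal((n, n)))

X = rng.standard_normal((n, m))  # each column is one draw of X
Y = A @ X

# Sample covariance of Y; should approximate A I A^T = I
cov = (Y @ Y.T) / m
assert np.allclose(cov, np.eye(n), atol=0.02)
```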

CB

3. Okay, I have been able to work out question 1. But I am really stuck on question 2 and consequently question 3.