This is because for iid random variables $X_1, \dots, X_n \sim \mathcal{N}(\mu, \sigma^2)$, the random variables $\bar{X}_n = \frac{1}{n}\sum_{i=1}^n X_i$ and $S_n^2 = \frac{1}{n-1}\sum_{i=1}^n (X_i - \bar{X}_n)^2$ are independent ("the empirical mean and variance of independent Gaussian r.v.'s are independent").

I only rewrote what you said, but the point is that this is very specific to Gaussian random variables, and you probably have it in your course notes.

I don't have a simple intuitive way to think about it. (It can be understood geometrically thanks to Cochran's theorem, using orthogonal projections of Gaussian random vectors, but this doesn't make it much more intuitive; anyway, I can develop this if you wish.) As for a proof, it requires some knowledge about Gaussian random vectors: the idea is that the covariance of $\bar{X}_n$ and each $X_i - \bar{X}_n$ is found to be zero, and that for Gaussian vectors uncorrelatedness implies independence. Again, I can develop this if you're really interested, but you may have seen it in your course.
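As a quick sanity check (not a proof), one can simulate this: for Gaussian samples the empirical correlation between the sample mean and the sample variance is near zero, while for a skewed distribution such as the exponential it is clearly nonzero, which illustrates that the property really is specific to the Gaussian case. A minimal sketch with NumPy (distribution and sample-size choices are mine, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 10, 200_000  # sample size per experiment, number of experiments

def mean_var_corr(draw):
    """Empirical correlation between sample mean and sample variance."""
    x = draw((trials, n))
    m = x.mean(axis=1)            # sample mean of each experiment
    v = x.var(axis=1, ddof=1)     # unbiased sample variance
    return np.corrcoef(m, v)[0, 1]

# Gaussian: correlation ~ 0 (consistent with, though not proving, independence).
corr_gauss = mean_var_corr(lambda s: rng.normal(size=s))
# Exponential (skewed): Cov(mean, variance) = mu_3 / n > 0, so clearly correlated.
corr_expo = mean_var_corr(lambda s: rng.exponential(size=s))
print(round(corr_gauss, 3), round(corr_expo, 3))
```

Of course, zero correlation alone would not establish independence; the simulation only shows that the exponential case visibly fails where the Gaussian case does not.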

Question 2) If $X_1, X_2, \dots, X_6$ are independent random variables, then my textbook has a theorem saying that $g_1(X_1), g_2(X_2), \dots, g_6(X_6)$ are also independent, where the $g_i$'s are any functions of a single random variable. But how about $f_1(X_1, X_2, \dots, X_5)$ and $f_2(X_6)$? If $X_1, X_2, \dots, X_6$ are independent, are any function of $X_1, \dots, X_5$ and any function of $X_6$ independent?

The answer is yes. This result is probably in your textbook as well, but I don't know its English name; in French it is called "indépendance par paquets", so perhaps "independence by packets" or "packetwise independence"?
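This grouping property can also be checked numerically: if $f_1(X_1,\dots,X_5)$ and $f_2(X_6)$ are independent, the joint probability of any pair of events factorizes. A small Monte Carlo sketch (the particular functions $f_1$, $f_2$ and the thresholds are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 500_000
X = rng.normal(size=(N, 6))  # six independent random variables per trial

# Arbitrary functions: f1 depends only on X1..X5, f2 only on X6.
f1 = np.sin(X[:, :5]).sum(axis=1) * np.exp(-X[:, 0] ** 2)
f2 = X[:, 5] ** 3

# Independence implies P(f1 <= a, f2 <= b) = P(f1 <= a) * P(f2 <= b)
# for any thresholds a, b; here we take the medians.
a, b = np.median(f1), np.median(f2)
p_joint = np.mean((f1 <= a) & (f2 <= b))
p_prod = np.mean(f1 <= a) * np.mean(f2 <= b)
print(abs(p_joint - p_prod))  # small, consistent with independence
```

A simulation can only support the claim for the particular functions tried, but it is a useful sanity check alongside the theorem.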