Hi, I have this question.
Suppose that the random variables X_1, ..., X_n are identically distributed, with mean \mu and variance \sigma^2, but not independent. Assume the correlation between any pair is equal to \rho, i.e. Corr(X_i, X_j) = \rho for i \neq j.
1) Derive Var(\bar{X}) for this situation.
2) What is Var(\bar{X}) when \rho=0? Explain.
3) What is Var(\bar{X}) when \rho=1? Explain.
4) Use the result you have derived to comment on how small \rho can be in this situation. Explain.
To me, this looks hard. I can do it for the case where the random variables are independent and identically distributed (because that is such a well-known result).
But how do you do it if the random variables are not independent? I'm a bit confused. Can someone lend me a hand or direct me to a suitable source for this question?
Thanks!!
Regards,
Lpd
Assuming \bar{X} is the mean of all the X_i, I think you'll need these relations, with Var(X_i) = \sigma^2:
\rho = Corr(X_i, X_j) = Cov(X_i, X_j) / sqrt(Var(X_i) Var(X_j))
Cov(X_i, X_j) = E(X_i X_j) - E(X_i) E(X_j)
Var(X_i + X_j) = Var(X_i) + Var(X_j) + 2 Cov(X_i, X_j)
For correlated variables, you can use the fact that the variance of a sum is the sum of all the pairwise covariances, where the diagonal terms Cov(X_i, X_i) = Var(X_i) supply the usual variance part.
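Putting those relations together, a sketch of the derivation under the equicorrelation assumption from the question (there are n variance terms and n(n-1) off-diagonal covariance terms, each equal to \rho\sigma^2):

```latex
\operatorname{Var}(\bar{X})
  = \frac{1}{n^2}\operatorname{Var}\!\Big(\sum_{i=1}^n X_i\Big)
  = \frac{1}{n^2}\Big(\sum_{i=1}^n \operatorname{Var}(X_i)
      + \sum_{i \neq j} \operatorname{Cov}(X_i, X_j)\Big)
  = \frac{1}{n^2}\big(n\sigma^2 + n(n-1)\rho\sigma^2\big)
  = \frac{\sigma^2}{n} + \frac{n-1}{n}\,\rho\sigma^2.
```

Since a variance can't be negative, this last expression also tells you something about how small \rho can be.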
Reading section 2.2.3 on page 54 of this link might help: http://sites.stat.psu.edu/~dhunter/a...ures/asymp.pdf
Variance - Wikipedia, the free encyclopedia
\rho = 0 means the variables are uncorrelated (independence would imply this, but not the other way around), and the variance will be given by the usual Var(\bar{X}) = \sigma^2 / n.
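If you want to convince yourself the algebra works out, here's a quick numerical sanity check (just a sketch; the values of n, sigma, and rho are example numbers I picked, not from the question):

```python
# Numerical check of Var(Xbar) for n equicorrelated random variables.
import numpy as np

n, sigma, rho = 5, 2.0, 0.3  # example values, chosen for illustration

# Equicorrelated covariance matrix: sigma^2 on the diagonal,
# rho * sigma^2 everywhere off the diagonal.
cov = np.full((n, n), rho * sigma**2)
np.fill_diagonal(cov, sigma**2)

# Xbar is a weighted sum with equal weights w_i = 1/n,
# so Var(Xbar) = w' C w.
w = np.full(n, 1.0 / n)
var_xbar = w @ cov @ w

# Closed form derived above: sigma^2/n + (n-1)/n * rho * sigma^2.
formula = sigma**2 / n + (n - 1) / n * rho * sigma**2

print(var_xbar, formula)  # the two values agree
```

Setting rho = 0 or rho = 1 in this script reproduces the answers to parts 2 and 3, and pushing rho below -1/(n-1) makes the matrix stop being a valid covariance matrix, which is the point of part 4.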