A and B are random variables from the same distribution, and a and b are constants. Let C = aA + bB. Show that

corr(A, C) = a / sqrt(a^2 + b^2)
corr(B, C) = b / sqrt(a^2 + b^2)

I'm stuck and cannot prove it. I would appreciate some help. Thanks!
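Not a proof, but here is a quick simulation that checks the claimed identity numerically. It assumes A and B are independent (so Cov(A, B) = 0) with equal variance, which the formula needs; the choice of a standard normal distribution and the values a = 2, b = 3 are just for illustration.

```python
import numpy as np

# Sanity check of corr(A, C) = a / sqrt(a^2 + b^2) by simulation.
# Assumption: A and B are independent draws from the same distribution,
# so Var(A) = Var(B) and Cov(A, B) = 0 -- the identity relies on this.
rng = np.random.default_rng(0)
a, b = 2.0, 3.0
n = 1_000_000

A = rng.normal(size=n)
B = rng.normal(size=n)
C = a * A + b * B

print(np.corrcoef(A, C)[0, 1])   # simulated corr(A, C)
print(a / np.hypot(a, b))        # claimed value a / sqrt(a^2 + b^2)
print(np.corrcoef(B, C)[0, 1])   # simulated corr(B, C)
print(b / np.hypot(a, b))        # claimed value b / sqrt(a^2 + b^2)
```

With a million samples the simulated correlations agree with the closed-form values to a couple of decimal places, which at least suggests the identity is the right thing to try to prove via corr(A, C) = Cov(A, C) / (sd(A) sd(C)).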