Hi, I need your help with this problem: Suppose $(X, Y)'$ follows a bivariate normal distribution with parameters $\mu_1, \mu_2, \sigma_1^2, \sigma_2^2$, and $\rho$. Let $U = X + Y$ and $V = X - Y$. Given that $X$ and $Y$ are not independent random variables, how do I get the joint distribution of $U$ and $V$? Thanks in advance!

You will have to use a change of variables in your integral. Have you come across the substitution theorem for multivariable integration?
Well, I believe you can use the linear-combination property instead, since $(X, Y)'$ is assumed bivariate normal: any linear combination
$U = aX + bY = (a \;\; b)(X \;\; Y)'$ is again normal. In this case we need $a = 1$, $b = 1$ to get $U = X + Y$.
The mean of $U$ is thus given by $E[U] = E[X] + E[Y] = \mu_1 + \mu_2$.
The variance of $U$ is given by the quadratic form
$$\operatorname{Var}(U) = (1 \;\; 1)\,\Sigma\,(1 \;\; 1)',$$
where $\Sigma = \begin{pmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{pmatrix}$ is the covariance matrix of $X$ and $Y$.
Note that the covariance matrix $\Sigma$ is not diagonal as in the independence case: here the off-diagonal covariance entries $\rho\sigma_1\sigma_2$ are assumed to be nonzero.
Therefore, we obtain (as expected) $\operatorname{Var}(U) = \sigma_1^2 + 2\rho\sigma_1\sigma_2 + \sigma_2^2$.
And since $(X, Y)'$ is bivariate normal by assumption, we obtain $U \sim N(\mu_1 + \mu_2,\; \sigma_1^2 + 2\rho\sigma_1\sigma_2 + \sigma_2^2)$.
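If you want to convince yourself numerically, here is a quick Monte Carlo sanity check of the mean and variance of $U$. The parameter values are arbitrary illustrative choices, not from the original problem:

```python
import numpy as np

# Arbitrary illustrative parameter values (any valid choice works)
mu1, mu2 = 1.0, -2.0
s1, s2, rho = 2.0, 3.0, 0.5

mean = np.array([mu1, mu2])
Sigma = np.array([[s1**2,         rho * s1 * s2],
                  [rho * s1 * s2, s2**2        ]])

# Draw samples of (X, Y)' and form U = X + Y
rng = np.random.default_rng(0)
xy = rng.multivariate_normal(mean, Sigma, size=200_000)
u = xy[:, 0] + xy[:, 1]

# Sample moments should be close to the derived formulas:
# E[U] = mu1 + mu2 = -1,  Var(U) = s1^2 + 2*rho*s1*s2 + s2^2 = 19
print(u.mean())
print(u.var())
```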
Similarly for $V$: replace $(1 \;\; 1)$ by $(1 \;\; {-1})$, since then $(1 \;\; {-1})(X \;\; Y)' = X - Y = V$, which gives $V \sim N(\mu_1 - \mu_2,\; \sigma_1^2 - 2\rho\sigma_1\sigma_2 + \sigma_2^2)$. To get the joint distribution of $(U, V)'$, stack the two rows into the matrix $A = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$, so that $(U, V)' = A(X, Y)'$ is again bivariate normal with mean $A(\mu_1, \mu_2)'$ and covariance $A\Sigma A'$; the cross term works out to $\operatorname{Cov}(U, V) = \sigma_1^2 - \sigma_2^2$.
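The matrix calculation for the joint mean and covariance can also be checked exactly. Again, the parameter values below are arbitrary illustrative choices:

```python
import numpy as np

# Arbitrary illustrative parameter values (any valid choice works)
mu1, mu2 = 1.0, -2.0
s1, s2, rho = 2.0, 3.0, 0.5

# Covariance matrix of (X, Y)'
Sigma = np.array([[s1**2,         rho * s1 * s2],
                  [rho * s1 * s2, s2**2        ]])

# (U, V)' = A (X, Y)' with U = X + Y, V = X - Y
A = np.array([[1.0,  1.0],
              [1.0, -1.0]])

# Joint mean and covariance of (U, V)'
mean_UV = A @ np.array([mu1, mu2])
cov_UV = A @ Sigma @ A.T

# Closed-form expressions from the derivation above
var_U = s1**2 + 2 * rho * s1 * s2 + s2**2   # 19.0
var_V = s1**2 - 2 * rho * s1 * s2 + s2**2   # 7.0
cov_U_V = s1**2 - s2**2                     # -5.0

print(mean_UV)                 # [-1.  3.]
print(cov_UV[0, 0], var_U)
print(cov_UV[1, 1], var_V)
print(cov_UV[0, 1], cov_U_V)
```

The off-diagonal entry of $A\Sigma A'$ is exactly $\sigma_1^2 - \sigma_2^2$, which shows that $U$ and $V$ are independent precisely when $\sigma_1^2 = \sigma_2^2$.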