1. ## Covariance Question

Hi all,

Would just like to confirm something and hopefully hear some of your thoughts.

$\displaystyle Cov\left(\sum_{i=1}^{N_1}X_i , \sum_{i=1}^{N_2}X_i\right)$

Here X_i are iid random variables of distribution X~Gamma(a,b) and N_1 and N_2 are dependent negative binomials.

Would it still be feasible to conclude that the covariance is 0 due to the independence of X_i (even though the N_i are dependent)?

Cheers

2. ## Re: Covariance Question

Originally Posted by casio415
Hi all,

Would just like to confirm something and hopefully hear some of your thoughts.

$\displaystyle Cov(\sum_{i=1}^{N_1}X_i , \sum_{i=1}^{N_2}X_i )$

Here X_i are iid random variables of distribution X~Gamma(a,b) and N_1 and N_2 are dependent negative binomials.

Would it still be feasible to conclude that the covariance is 0 due to the independence of X_i (even though the N_i are dependent)?

Cheers
$\displaystyle Cov\left(\sum_{i=1}^{N_1}X_i , \sum_{j=1}^{N_2}X_j\right) = \sum_{i=1}^{N_1} \sum_{j=1}^{N_2}Cov(X_i,X_j), \qquad Cov(X_i,X_j)=\begin{cases}0 & i \neq j \\ Var(X) & i=j\end{cases}$

(The first step holds conditionally on $\displaystyle N_1\mbox{ and }N_2$, since the number of terms in each sum is random.)

So the off-diagonal terms are still 0, but the covariance is not: only the $\displaystyle \min(N_1,N_2)$ diagonal pairs survive, so conditionally on $\displaystyle N_1\mbox{ and }N_2$ the covariance is $\displaystyle \min(N_1,N_2)\,Var(X)$, which depends on $\displaystyle N_1\mbox{ and }N_2$.
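Not part of the original replies, but here is a quick Monte Carlo sketch of the claim. The thread doesn't specify the joint law of $N_1, N_2$, so the construction of the dependent negative binomials via a shared component $M$ (so that $N_1 = M + K_1$, $N_2 = M + K_2$) is purely an illustrative assumption; the empirical covariance is checked against the law-of-total-covariance value $Var(X)\,E[\min(N_1,N_2)] + (E[X])^2\,Cov(N_1,N_2)$.

```python
# Monte Carlo check of Cov(S1, S2) where S_k = sum_{i=1}^{N_k} X_i,
# the X_i are iid Gamma(a, b) shared between the two sums, and the
# N_k are dependent negative binomials (assumed construction below,
# via a shared component M, just for illustration):
#   N1 = M + K1,  N2 = M + K2,  with M, K1, K2 independent NB(r, p).
import numpy as np

rng = np.random.default_rng(0)
n = 100_000          # Monte Carlo replications
a, b = 2.0, 1.0      # Gamma shape and scale: E[X] = a*b, Var(X) = a*b**2
p = 0.5
M = rng.negative_binomial(5, p, size=n)   # shared component -> dependence
N1 = M + rng.negative_binomial(3, p, size=n)
N2 = M + rng.negative_binomial(4, p, size=n)

# One shared iid Gamma sequence per replication; S_k is the sum of the
# first N_k draws (0 when N_k = 0, handled by the prepended zero column).
n_max = int(max(N1.max(), N2.max()))
X = rng.gamma(a, b, size=(n, n_max))
cs = np.concatenate([np.zeros((n, 1)), np.cumsum(X, axis=1)], axis=1)
rows = np.arange(n)
S1, S2 = cs[rows, N1], cs[rows, N2]

emp_cov = np.cov(S1, S2)[0, 1]
# Law of total covariance:
#   Cov(S1, S2) = Var(X) * E[min(N1, N2)] + (E[X])^2 * Cov(N1, N2)
var_x, mean_x = a * b**2, a * b
theory_cov = (var_x * np.minimum(N1, N2).mean()
              + mean_x**2 * np.cov(N1, N2)[0, 1])
print(emp_cov, theory_cov)
```

With these parameters both values come out well away from 0, confirming that the dependence through the shared $X_i$ (and through $N_1, N_2$) makes the covariance nonzero.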

3. ## Re: Covariance Question

Thanks for the reply. One thing though - the LHS covariance presumably is a known constant, whereas the RHS is in terms of N_1 and N_2, which are random variables. How can this be reconciled?

4. ## Re: Covariance Question

Good catch - the computation in the earlier reply is conditional on $\displaystyle N_1\mbox{ and }N_2$, so what it produces is a conditional covariance, which is indeed a random variable. To get the unconditional (constant) covariance you average over $\displaystyle (N_1, N_2)$ using the law of total covariance, which also picks up a term from the randomness of the conditional means: $\displaystyle Cov = Var(X)\,E[\min(N_1,N_2)] + (E[X])^2\,Cov(N_1,N_2)$.
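For reference, one way to make this precise is the law of total covariance (this assumes the $X_i$ are independent of the pair $(N_1, N_2)$, as in the original question). Writing $S_k = \sum_{i=1}^{N_k} X_i$:

$\displaystyle Cov(S_1,S_2) = E\left[Cov(S_1,S_2 \mid N_1,N_2)\right] + Cov\left(E[S_1 \mid N_1,N_2],\, E[S_2 \mid N_1,N_2]\right)$

Given $(N_1,N_2)$, the shared terms give $Cov(S_1,S_2 \mid N_1,N_2) = \min(N_1,N_2)\,Var(X)$, and the conditional means are $E[S_k \mid N_1,N_2] = N_k\,E[X]$, so

$\displaystyle Cov(S_1,S_2) = Var(X)\,E[\min(N_1,N_2)] + (E[X])^2\,Cov(N_1,N_2)$

which is a constant, as expected, and in general nonzero.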