# Gamma distribution question with n independent variables

• Oct 1st 2008, 10:44 PM
lllll
Gamma distribution question with n independent variables
I need some help with the following statement:

Let $\displaystyle Y_1,Y_2,...,Y_n$ be independent r.v., where $\displaystyle Y_i$ has a gamma distribution with parameters $\displaystyle \alpha_i$ and $\displaystyle \beta$ (where $\displaystyle \beta$ is fixed and the $\displaystyle \alpha_i$ may differ). Prove that $\displaystyle U=Y_1+Y_2+...+Y_n$ has a gamma distribution with parameters $\displaystyle \alpha_1+\alpha_2+...+\alpha_n$ and $\displaystyle \beta$.

so far I have:

$\displaystyle U=Y_1+Y_2+...+Y_n$

$\displaystyle m_U(t) = m_{Y_1}(t)\times m_{Y_2}(t)\times ... \times m_{Y_n}(t)$

$\displaystyle m_U(t) = (1-\beta t)^{-\alpha_1} \times (1-\beta t)^{-\alpha_2} \times ... \times (1-\beta t)^{-\alpha_n}$

$\displaystyle m_U(t)=\prod^{n}_{i=1} (1-\beta t)^{-\alpha_i}$

at which point I don't know how to continue it.
• Oct 1st 2008, 11:40 PM
Laurent
Why not write: $\displaystyle m_U(t)=\prod^{n}_{i=1} (1-\beta t)^{-\alpha_i}= (1-\beta t)^{-\sum_{i=1}^n\alpha_i}$ ? This is the MGF of a gamma distribution with parameters $\displaystyle \sum_{i=1}^n\alpha_i$ and $\displaystyle \beta$, and since the MGF determines the distribution, you're done.
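Not part of the proof, but here is a quick numerical sanity check of the result using only the standard library (assuming the shape–scale parameterization, so a Gamma$(\alpha,\beta)$ variable has mean $\alpha\beta$ and variance $\alpha\beta^2$; the shape values below are just illustrative):

```python
import random

random.seed(0)

alphas = [1.5, 2.0, 3.5]  # illustrative shape parameters alpha_i
beta = 2.0                # common scale parameter
n_samples = 100_000

# Draw U = Y_1 + ... + Y_n, with each Y_i ~ Gamma(alpha_i, beta).
samples = [
    sum(random.gammavariate(a, beta) for a in alphas)
    for _ in range(n_samples)
]

mean = sum(samples) / n_samples
var = sum((x - mean) ** 2 for x in samples) / n_samples

# If U ~ Gamma(sum(alphas), beta), then E[U] = beta * sum(alphas)
# and Var(U) = beta^2 * sum(alphas).
total_alpha = sum(alphas)
print(mean, beta * total_alpha)      # sample mean vs. theoretical mean
print(var, beta ** 2 * total_alpha)  # sample variance vs. theoretical variance
```

The sample mean and variance should land close to $\displaystyle \beta\sum\alpha_i = 14$ and $\displaystyle \beta^2\sum\alpha_i = 28$, consistent with the claimed Gamma$(\sum\alpha_i,\beta)$ distribution (matching moments is of course only a check, not the proof).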

(By the way, it is not straightforward to show that the moment generating function of a Gamma distribution is $\displaystyle (1-\beta t)^{-\alpha}$, and the most elementary proof of your statement is by change of variable in multiple integrals. But if you were given the moment generating function, you're definitely right to use it, since this is way quicker.)
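For completeness, here is a sketch of the MGF computation Laurent alludes to, assuming the shape–scale density $\displaystyle f(y)=\frac{1}{\Gamma(\alpha)\beta^{\alpha}}y^{\alpha-1}e^{-y/\beta}$ for $y>0$:

$\displaystyle m_Y(t)=\int_0^\infty e^{ty}\,\frac{y^{\alpha-1}e^{-y/\beta}}{\Gamma(\alpha)\beta^{\alpha}}\,dy =\frac{1}{\Gamma(\alpha)\beta^{\alpha}}\int_0^\infty y^{\alpha-1}e^{-y(1/\beta-t)}\,dy =\frac{\Gamma(\alpha)\,(1/\beta-t)^{-\alpha}}{\Gamma(\alpha)\beta^{\alpha}} =(1-\beta t)^{-\alpha},$

valid for $\displaystyle t<1/\beta$, using the gamma integral $\displaystyle \int_0^\infty y^{\alpha-1}e^{-cy}\,dy=\Gamma(\alpha)c^{-\alpha}$ for $c>0$.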