# Chi-squared(k) distribution proof

• Aug 24th 2009, 08:15 AM
ynotidas
Chi-squared(k) distribution proof
The information I have been given is:

If X1,...,Xk are independent N(0,1) random variables, then
W = (X1)^2 + (X2)^2 + ... + (Xk)^2
has a Chi-squared(k) distribution, where k is the degrees of freedom (the number of independent squares in the sum W)

Show that:
If Z1 ~ Gamma (alpha1, beta) and Z2 ~ Gamma (alpha2, beta); Z1 and Z2 are independent, then Z = Z1 + Z2 ~ Gamma(alpha1+alpha2, beta).

Hence show that W ~ Gamma(k/2, 1/2)

(On a sidenote, can you tell me, or point me towards, how to type in mathematical format? Like a website or page explaining how to type mathematical formulas, instead of the way I do it now, which is just plain text.)

Thanks!
Ynotidas
• Aug 24th 2009, 10:13 AM
Moo
Hello,

For the format, look here: http://www.mathhelpforum.com/math-help/latex-help/ - see the threads in the upper part.

Quote:

If X1,...,Xk are independent N(0,1) random variables, then
W = (X1)^2 + (X2)^2 + ... + (Xk)^2
has a Chi-squared(k) distribution, where k is the degrees of freedom (number of independent squares in the sum of W)
First, find the distribution of $\displaystyle X_1^2$.

the pdf of N(0,1) is $\displaystyle \frac{1}{\sqrt{2\pi}} \cdot e^{-t^2/2}$, for t in $\displaystyle \mathbb{R}$

Now, here is a method (the law of the unconscious statistician) that I use all the time. Otherwise (it's very similar) you can use the Jacobian transformation of the pdf (a search in this subforum may give you some results).
Remember that a Jacobian transformation has to be a diffeomorphism (in other words, a bijection).

for any measurable function h,
$\displaystyle \mathbb{E}(h(X^2))=\int_{\mathbb{R}} h(x^2) \cdot \frac{1}{\sqrt{2\pi}} \cdot e^{-x^2/2} ~dx$

If you make the Jacobian transformation y = x² right away, you'll have a problem, since x ↦ x² is not a bijection on all of $\displaystyle \mathbb{R}$.
So first note that the integrand is an even function.

hence $\displaystyle \mathbb{E}(h(X^2))=2\int_0^\infty h(x^2) \cdot \frac{1}{\sqrt{2\pi}} \cdot e^{-x^2/2} ~dx$

Now make the transformation y = x². This gives $\displaystyle dy=2x ~ dx=2 \sqrt{y} ~dx$, i.e. $\displaystyle dx = \frac{dy}{2\sqrt{y}}$.

Hence $\displaystyle \mathbb{E}(h(X^2))=\int_0^\infty h(y) \cdot\frac{1}{\sqrt{2\pi}} \cdot e^{-y/2} \cdot y^{-1/2} ~dy$ (that's simple calculus & manipulations)
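Spelling out that step (the factor of 2 from the even-function trick cancels against the $\displaystyle 2\sqrt{y}$ in the substitution):

$\displaystyle \mathbb{E}(h(X^2))=2\int_0^\infty h(y) \cdot \frac{1}{\sqrt{2\pi}} \cdot e^{-y/2} \cdot \frac{dy}{2\sqrt{y}}=\int_0^\infty h(y) \cdot \frac{1}{\sqrt{2\pi}} \cdot e^{-y/2} \cdot y^{-1/2} ~dy$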

so the pdf of $\displaystyle Y=X^2$ is $\displaystyle \frac{1}{\sqrt{2\pi}} \cdot e^{-y/2} \cdot y^{-1/2}$ for $\displaystyle y>0$, which is the pdf of a Gamma distribution $\displaystyle \gamma \left(\tfrac 12, \tfrac 12\right)$ (shape 1/2, rate 1/2)
(there are several conventions for the parameters of a gamma distribution; with a scale parameter instead of a rate, it would be written $\displaystyle \gamma \left(\tfrac 12, 2\right)$)
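To check this identification: with the rate convention, a $\displaystyle \gamma(\alpha,\beta)$ distribution has pdf $\displaystyle \frac{\beta^\alpha}{\Gamma(\alpha)} \cdot y^{\alpha-1} \cdot e^{-\beta y}$ for $\displaystyle y>0$. Plugging in $\displaystyle \alpha=\beta=\tfrac 12$ and using $\displaystyle \Gamma\left(\tfrac 12\right)=\sqrt{\pi}$:

$\displaystyle \frac{(1/2)^{1/2}}{\Gamma(1/2)} \cdot y^{-1/2} \cdot e^{-y/2}=\frac{1}{\sqrt{2} \cdot \sqrt{\pi}} \cdot y^{-1/2} \cdot e^{-y/2}=\frac{1}{\sqrt{2\pi}} \cdot e^{-y/2} \cdot y^{-1/2}$

which matches the pdf found above.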

Quote:

If Z1 ~ Gamma (alpha1, beta) and Z2 ~ Gamma (alpha2, beta); Z1 and Z2 are independent, then Z = Z1 + Z2 ~ Gamma(alpha1+alpha2, beta).
For this part, the most straightforward method is to use the mgf (moment generating function) of a gamma distribution,
and to remember that the mgf of a sum of independent random variables is the product of the mgfs.
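As a sketch (still with the rate convention): the mgf of $\displaystyle \gamma(\alpha,\beta)$ is $\displaystyle M(t)=\left(\frac{\beta}{\beta-t}\right)^{\alpha}$ for $\displaystyle t<\beta$, so by independence

$\displaystyle M_{Z_1+Z_2}(t)=M_{Z_1}(t) \cdot M_{Z_2}(t)=\left(\frac{\beta}{\beta-t}\right)^{\alpha_1} \left(\frac{\beta}{\beta-t}\right)^{\alpha_2}=\left(\frac{\beta}{\beta-t}\right)^{\alpha_1+\alpha_2}$

which is the mgf of $\displaystyle \gamma(\alpha_1+\alpha_2,\beta)$; since the mgf determines the distribution, $\displaystyle Z_1+Z_2 \sim \gamma(\alpha_1+\alpha_2,\beta)$.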

Then use this last part (applied repeatedly, i.e. by induction on k) to conclude: since each $\displaystyle X_i^2 \sim \gamma\left(\tfrac 12, \tfrac 12\right)$, the sum satisfies $\displaystyle W \sim \gamma\left(\tfrac k2, \tfrac 12\right)$, which is exactly the chi-square distribution with k degrees of freedom; a chi-square distribution is just a gamma distribution with these parameters.
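Not part of the proof, but if you want a numerical sanity check of the conclusion, here is a small simulation sketch (assuming numpy and scipy are available; note that scipy parameterizes the gamma distribution by shape and scale, so rate 1/2 corresponds to scale 2):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
k = 5            # degrees of freedom
n = 200_000      # number of simulated values of W

# W = X1^2 + ... + Xk^2 with the Xi independent N(0,1)
w = (rng.standard_normal((n, k)) ** 2).sum(axis=1)

# Gamma(k/2, rate 1/2) is Gamma(shape=k/2, scale=2) in scipy's convention
ks_stat, p_value = stats.kstest(w, stats.gamma(a=k / 2, scale=2).cdf)

print(w.mean())   # should be close to k   (Gamma mean: alpha/beta = (k/2)/(1/2))
print(w.var())    # should be close to 2k  (Gamma variance: alpha/beta^2)
print(ks_stat)    # small KS statistic: empirical cdf matches the Gamma cdf
```

The Kolmogorov-Smirnov statistic compares the empirical cdf of the simulated W against the claimed Gamma(k/2, 1/2) cdf; a value near zero supports the identification.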