Independence rules

• Feb 24th 2011, 04:09 AM
malaguena
Independence rules
Hello,

It is a simple probability question, but it has been too long since I took a statistics course and I am not able to solve the problem. I would appreciate it if you could give me an idea for the problem.

I have a joint probability distribution on the variables x and y given the parameters a and b:

$p(x,y|a,b)$

x and y are assumed to be independent, and $p(x|a)$ and $p(y|b)$ are separate multinomial distributions, so a is the parameter for x and b is the parameter for y. Because of the independence between x and y, the joint can be factored as:

$p(x,y|a,b) = p(x|a,b)\, p(y|a,b)$

Is it also possible to simplify this further, by discarding b for x and a for y, as given below?

$p(x,y|a,b) = p(x|a)\, p(y|b)$

My intuition is that b does not mean anything for x, and a does not mean anything for y.

• Feb 25th 2011, 01:12 PM
SpringFan25
The short answer is yes: your intuition that irrelevant parameters can be removed is correct.

More formally, treating b as continuous (the discrete case is analogous, with a sum in place of the integral):

$p(x|a) = \int p(x|a,b)p(b) db$

Since you have said that $p(x|a,b)$ does not actually vary with b, it can be taken outside the integral:

$p(x|a) = p(x|a,b) \int p(b) db$

But the integral of a pdf over its whole support is always 1, so:
$p(x|a) = p(x|a,b) \times 1$

$p(x|a) = p(x|a,b)$
which is the result you wanted to show. You can do the same thing for $p(y|a,b)$.
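The argument above can also be checked numerically for the discrete case. The sketch below uses made-up numbers: b takes two values with an assumed prior $p(b)$, and the conditional $p(x|a,b)$ is the same for both values of b, so marginalising over b leaves it unchanged.

```python
import numpy as np

# Hypothetical example: x has 3 categories whose probabilities depend only
# on the parameter a; b takes 2 values with an assumed prior p(b).
# Rows are values of b; both rows are identical, i.e. p(x|a,b) does not vary with b.
p_x_given_ab = np.array([[0.2, 0.5, 0.3],   # b = 0
                         [0.2, 0.5, 0.3]])  # b = 1
p_b = np.array([0.6, 0.4])

# Discrete analogue of the integral: p(x|a) = sum_b p(x|a,b) p(b)
p_x_given_a = p_b @ p_x_given_ab

# Because p(x|a,b) does not depend on b, marginalising changes nothing:
print(np.allclose(p_x_given_a, p_x_given_ab[0]))  # True
```

The same check applies symmetrically to $p(y|a,b)$ with the roles of a and b swapped.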
• Feb 26th 2011, 02:22 PM
malaguena
Very good proof. Thank you very much!