
Independence rules
Hello,
It is a simple probability question, but it has been too long since I took a statistics course and I am not able to solve it. I would appreciate it if you could give me an idea for the problem.
I have a joint probability distribution on the variables x and y given the parameters a and b:
$\displaystyle p(x,y \mid a,b)$
x and y are assumed to be independent, and $\displaystyle p(x \mid a)$ and $\displaystyle p(y \mid b)$ are separate multinomial distributions, so a is defined for x and b is defined for y. Because of the independence between x and y, the joint can therefore be written as:
$\displaystyle p(x \mid a,b)\, p(y \mid a,b)$
Is it also possible to derive the equation below by discarding b for x, and a for y?
$\displaystyle p(x \mid a)\, p(y \mid b)$
Because b does not mean anything for x, and a does not mean anything for y.
Thanks a lot in advance!

The short answer is yes. Your intuition that irrelevant parameters can be removed is correct.
More formally, for continuous variables (without loss of generality):
$\displaystyle p(x \mid a) = \int p(x \mid a,b)\, p(b)\, db$
Since you have said that $p(x \mid a,b)$ does not actually vary with b, it can be pulled outside the integral:
$\displaystyle p(x \mid a) = p(x \mid a,b) \int p(b)\, db$
But the integral of a pdf is always 1:
$\displaystyle p(x \mid a) = p(x \mid a,b) \times 1$
$\displaystyle p(x \mid a) = p(x \mid a,b)$
which is the result you wanted to show. You can do the same thing for $p(y \mid a,b)$.
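If it helps, here is a quick numeric sanity check of the same argument in the discrete case (the integral becomes a sum over b). The table and prior below are made-up numbers, chosen so that $p(x \mid a,b)$ genuinely does not depend on b:

```python
import numpy as np

# Hypothetical example: p(x|a,b) as a table over 3 values of x,
# constructed so it does NOT depend on b (identical rows for each b).
p_x_given_ab = np.array([
    [0.2, 0.5, 0.3],   # b = 0
    [0.2, 0.5, 0.3],   # b = 1
    [0.2, 0.5, 0.3],   # b = 2
])

# An arbitrary prior p(b) over the 3 values of b (sums to 1).
p_b = np.array([0.1, 0.6, 0.3])

# Marginalize out b: p(x|a) = sum_b p(x|a,b) p(b),
# the discrete analogue of the integral in the proof above.
p_x_given_a = p_b @ p_x_given_ab

print(p_x_given_a)  # -> [0.2 0.5 0.3]
```

The marginal comes out equal to any single row of the table, i.e. $p(x \mid a) = p(x \mid a,b)$, exactly as the proof predicts.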

Very good proof. Thank you very much!