1. ## Independence rules

Hello,

This is a simple probability question, but it has been too long since I took a statistics course and I cannot solve it. I would appreciate any ideas on the problem.

I have a joint probability distribution on the variables x and y given the parameters a and b:

$\displaystyle p(x,y|a,b)$

x and y are assumed to be independent, and $\displaystyle p(x|a)$ and $\displaystyle p(y|b)$ are each multinomial distributions; that is, a parameterizes x and b parameterizes y. Because of the independence between x and y, the joint can be written as:

$\displaystyle p(x,y|a,b) = p(x|a,b)\, p(y|a,b)$

Is it also possible to simplify further, discarding b for x and a for y, as below?

$\displaystyle p(x,y|a,b) = p(x|a)\, p(y|b)$

Because b does not mean anything for x, and a does not mean anything for y.

2. The short answer is yes: your intuition that irrelevant parameters can be removed is correct.

More formally, treating the parameter b as a continuous variable (without loss of generality), marginalize it out:

$\displaystyle p(x|a) = \int p(x|a,b)p(b) db$

Since you have said that $\displaystyle p(x|a,b)$ does not actually vary with b, it can be taken outside the integral:

$\displaystyle p(x|a) = p(x|a,b) \int p(b) db$

But the integral of a pdf is always 1:

$\displaystyle p(x|a) = p(x|a,b) \times 1$

$\displaystyle p(x|a) = p(x|a,b)$
which is the result you wanted to show. The same argument, integrating out a, applies to $\displaystyle p(y|a,b)$.
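A quick numerical sanity check of the marginalization step, in the discrete case: if $p(x|a,b)$ is the same for every value of b, then $\sum_b p(x|a,b)\,p(b)$ recovers that common distribution exactly. This is a minimal sketch with made-up numbers (the specific probabilities and the two settings of b are assumptions for illustration, not from the thread):

```python
import numpy as np

# Hypothetical setup: x takes 3 values; its distribution depends only
# on a (held fixed here), not on b. b has 2 settings with prior p(b).
p_x_given_a = np.array([0.2, 0.5, 0.3])              # p(x|a), sums to 1
p_x_given_ab = np.stack([p_x_given_a, p_x_given_a])  # p(x|a,b): identical row per b
p_b = np.array([0.7, 0.3])                           # prior over b, sums to 1

# Discrete analogue of p(x|a) = sum_b p(x|a,b) p(b)
p_x_marginal = (p_x_given_ab * p_b[:, None]).sum(axis=0)

# Because p(x|a,b) is constant in b, the mixture collapses back to p(x|a)
print(np.allclose(p_x_marginal, p_x_given_a))  # True
```

Since $p(b)$ sums to 1, the weighted sum of identical rows is just that row, which mirrors the $\int p(b)\,db = 1$ step in the proof above.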

3. Very good proof. Thank you very much!