## Simplified Bayes for many variables

Dear Help Forum

I have looked at the case of 3 events A, B, C which are partially dependent.
I define the following probabilities as inputs:

P(A), P(B), P(C), P(B|A), P(C|B), P(C|A), P(C|A,B)

(I could choose other ones.) From these I apply Bayes' theorem and the law of total probability to calculate, in an Excel spreadsheet, the remaining conditional probabilities of each event given the other two:

P(~A|B,C), P(B|A,C), P(~B|A,C), P(A|B,~C), P(~A|B,~C), P(B|A,~C), P(~B|A,~C), P(C|~A,B), P(~C|~A,B), P(B|~A,C), P(~B|~A,C)

P(C|A,~B), P(~C|A,~B), P(A|~B,C), P(~A|~B,C), P(B|~A,~C), P(~B|~A,~C), P(A|~B,~C), P(~A|~B,~C), P(C|~A,~B), P(~C|~A,~B)
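For concreteness, here is a small sketch of what I am doing (in Python rather than Excel, with made-up but mutually consistent placeholder numbers): the eight joint cells P(A,B,C), P(A,B,~C), ... are assembled from my seven inputs via the product rule and total probability, and every conditional in the list above then follows from Bayes' theorem by dividing joint cells.

```python
# Inputs (placeholder values, chosen to be mutually consistent):
p_a, p_b, p_c = 0.5, 0.5, 0.5
p_b_given_a = 0.6
p_c_given_b = 0.65
p_c_given_a = 0.6
p_c_given_ab = 0.7

# Two-event marginal via the product rule.
p_ab = p_b_given_a * p_a                    # P(A,B)

# Three-event cells: P(A,B,C) directly, the rest by total probability.
p_abc = p_c_given_ab * p_ab                 # P(A,B,C)
p_aBc = p_c_given_a * p_a - p_abc           # P(A,~B,C), since P(A,C) = P(A,B,C) + P(A,~B,C)
p_Abc = p_c_given_b * p_b - p_abc           # P(~A,B,C), since P(B,C) = P(A,B,C) + P(~A,B,C)
p_ABc = p_c - p_abc - p_aBc - p_Abc         # P(~A,~B,C), the remainder of P(C)

# Full joint table, keyed by (a, b, c) with 1 = event occurs.
joint = {
    (1, 1, 1): p_abc,
    (1, 0, 1): p_aBc,
    (0, 1, 1): p_Abc,
    (0, 0, 1): p_ABc,
    (1, 1, 0): p_ab - p_abc,                    # P(A,B,~C)
    (1, 0, 0): (p_a - p_ab) - p_aBc,            # P(A,~B,~C)
    (0, 1, 0): (p_b - p_ab) - p_Abc,            # P(~A,B,~C)
    (0, 0, 0): (1 - p_a - p_b + p_ab) - p_ABc,  # P(~A,~B,~C)
}

# Any conditional by Bayes, e.g. P(~A | B, C):
p_notA_given_bc = joint[(0, 1, 1)] / (joint[(0, 1, 1)] + joint[(1, 1, 1)])
print(p_notA_given_bc)
```

(The variable names and numbers here are only illustrative; the point is that once the joint table exists, every conditional is just a ratio of sums of its cells.)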

Now I would like to proceed to many events, say N of them. Obviously it becomes too painful to define all the necessary (2^N - 1) inputs and
calculate all the outcomes exhaustively in the same manner.

I need to simplify the problem by invoking conditional independence.
For example, in the case of 4 events I envisage defining only the following inputs:

P(A), P(B), P(C), P(D), P(B|A), P(C|A,B), P(D|A,B,C)

and making the assumption that the other 'cross-dependencies' are redundant,

for example

P(D|A,C) = P(D).
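To make this idea concrete, here is a sketch (Python, placeholder numbers) of the strongest version of such an assumption, where each event depends only on its immediate predecessor, i.e. a simple chain-shaped Bayesian network. Once the joint factorises this way, any conditional such as P(C|A,~B,~D) can be computed by summing joint cells that match the evidence:

```python
from itertools import product

# Assumed chain inputs (made-up numbers):
p_first = 0.5                    # P(X_0 = 1)
p_given_prev = {0: 0.3, 1: 0.7}  # P(X_i = 1 | X_{i-1} = 0 or 1)

N = 4  # events A, B, C, D as X_0 .. X_3

def joint(outcome):
    """P(X_0 .. X_{N-1} = outcome) under the chain factorisation."""
    p = p_first if outcome[0] else 1 - p_first
    for prev, cur in zip(outcome, outcome[1:]):
        q = p_given_prev[prev]
        p *= q if cur else 1 - q
    return p

def conditional(target_idx, target_val, evidence):
    """P(X_target = target_val | evidence), where evidence maps index -> value."""
    num = den = 0.0
    for outcome in product((0, 1), repeat=N):
        if all(outcome[i] == v for i, v in evidence.items()):
            p = joint(outcome)
            den += p
            if outcome[target_idx] == target_val:
                num += p
    return num / den

# e.g. P(C | A, ~B, ~D):
print(conditional(2, 1, {0: 1, 1: 0, 3: 0}))
```

This enumerates all 2^N outcomes, so it only works for modest N, but it shows how few inputs the chain assumption needs; a Bayesian network generalises this to arbitrary dependence structures.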

I am not sure how to proceed from here and apply Bayes' theorem to calculate the probabilities of all the output scenarios,
e.g. P(C|A,~B,~D), etc. Maybe I have to employ a numerical approach using the binomial distribution?
I have read something about "Naive Bayes" and "Empirical Bayes" - not sure if they are relevant.
Or do I need to employ a Bayesian network approach?

Thanks a lot in advance for taking the time to read this; any responses would be greatly appreciated.

Regards

Tim