Thread: How can I combine these two likelihoods?

1. How can I combine these two likelihoods?

Consider an unknown number of new-born animal cubs, B, which can be split into two groups: those born from subadult animals, B1, and those born from adult animals, B2, where B = B1 + B2. We sample these cubs, with each of the two groups having its own individual-level probability of being selected, p1 and p2, respectively. The numbers of cubs sampled are denoted b1 and b2, respectively.

I want to build a likelihood for the total number of cubs, B, given that I know p1, p2, b1, and b2, to be used in a larger joint likelihood. If I were interested in building a likelihood for the two different groups of cubs, I could write:

L(B1 | p1,b1) = [B1 choose b1] * (p1)^{b1} * (1 - p1)^{B1 - b1}
L(B2 | p2,b2) = [B2 choose b2] * (p2)^{b2} * (1 - p2)^{B2 - b2}
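(As a quick numerical sanity check, with made-up numbers and assuming scipy is available, each of these is just a binomial pmf evaluated at the observed count:)

```python
from math import comb
from scipy.stats import binom

# Illustrative values only
B1, b1, p1 = 60, 30, 0.45   # cubs of subadults: total, sampled, selection prob.
B2, b2, p2 = 35, 20, 0.65   # cubs of adults

# L(B1 | p1, b1): binomial pmf of observing b1 sampled cubs out of B1
L1 = binom.pmf(b1, B1, p1)
# Same quantity from the closed-form expression above
L1_manual = comb(B1, b1) * p1**b1 * (1 - p1)**(B1 - b1)
assert abs(L1 - L1_manual) < 1e-12

L2 = binom.pmf(b2, B2, p2)
print(L1, L2)
```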

What I want, however, is

L(B | p1,b1,p2,b2) = ?

The full likelihood already has a large number of parameters to estimate, so I'd rather just have a single parameter for the number of new-born cubs each year, as opposed to having two parameters for each year - new-born cubs with sub-adult parents and new-born cubs with adult parents.

Any help would be greatly appreciated! A better-formatted version of the equations can be found in this thread: https://stats.stackexchange.com/ques...wo-likelihoods

2. Re: How can I combine these two likelihoods?

Hey Bergatory.

That likelihood is a conditional probability, but what you need is a joint distribution.

Remember that P(A|B) = P(A and B)/P(B); in your case the P(B) part is P(p1 and b1 and p2 and b2).

If things are independent, then you can simplify P(A and B) = P(A)P(B); if not, you need a formula that combines both, and you can't separate the two with multiplication easily.

Show us what you have tried to do to get that likelihood and we can look at it further.

You should look at a single event of choosing the cubs and see how the probabilities of the next choice are impacted.

If you use standard sampling without replacement then you will use what is called a hypergeometric distribution.

https://en.wikipedia.org/wiki/Hyperg...c_distribution

I'd recommend though you do a first principles approach and look at each selection bit by bit.
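For instance, if all B cubs were sampled jointly without replacement, the number of subadult-born cubs in the sample would follow a hypergeometric distribution (illustrative numbers, scipy assumed):

```python
from math import comb
from scipy.stats import hypergeom

# Illustrative numbers only: 50 cubs total, 20 born to subadults, sample 10.
B, B1, n = 50, 20, 10

# P(exactly 4 subadult-born cubs in the sample), sampling without replacement.
# scipy's argument order is (k, M, n, N): population M, successes n, draws N.
p4 = hypergeom.pmf(4, B, B1, n)

# Cross-check against the counting formula C(B1,4)*C(B-B1,6)/C(B,10)
p4_manual = comb(B1, 4) * comb(B - B1, 6) / comb(B, 10)
assert abs(p4 - p4_manual) < 1e-12
print(p4)
```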

3. Re: How can I combine these two likelihoods?

Hi chiro,

The two events of choosing individuals are independent of each other (knowing how many cubs of sub-adults were selected does not change how many cubs of adults get selected). However, I essentially need to make them dependent to avoid estimating B1 and B2 separately.

My initial attempts are as follows:

L(B1, B2 | p1,b1, p2,b2) = [B1 choose b1] * (p1)^{b1} * (1 - p1)^{B1 - b1} * [B2 choose b2] * (p2)^{b2} * (1 - p2)^{B2 - b2}

L(B, B2 | p1,b1, p2,b2) = [(B - B2) choose b1] * (p1)^{b1} * (1 - p1)^{(B - B2) - b1} * [B2 choose b2] * (p2)^{b2} * (1 - p2)^{B2 - b2}

The problem is that I can't seem to eliminate the presence of both B1 and B2 from the equations simultaneously. Basically, I only want B, b1, b2, p1, and p2 in the equation.
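One way to sketch this numerically is to treat B2 as a nuisance parameter and profile it out: for each candidate B, maximize the second factor over B2 and keep only the resulting function of B. A rough illustration (made-up numbers, scipy/numpy assumed):

```python
import numpy as np
from scipy.stats import binom

# Illustrative data: observed samples and known selection probabilities
b1, p1 = 30, 0.45
b2, p2 = 20, 0.65

def profile_loglik(B):
    """log L(B) = max over B2 of log L(B - B2 | p1, b1) + log L(B2 | p2, b2)."""
    B2 = np.arange(b2, B - b1 + 1)      # need B2 >= b2 and B1 = B - B2 >= b1
    ll = binom.logpmf(b1, B - B2, p1) + binom.logpmf(b2, B2, p2)
    return ll.max()

# Grid search for the MLE of B
Bs = range(b1 + b2, 201)
B_hat = max(Bs, key=profile_loglik)
print(B_hat)
```

With these particular numbers the profile maximum coincides with maximizing over (B1, B2) separately, since profiling over B2 with B free is equivalent to an unconstrained joint maximization.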

4. Re: How can I combine these two likelihoods?

If you can't do it analytically, you will probably have to set up a numerical scheme to converge to the answer.

Also - I'd recommend looking at MCMC techniques for estimating these kinds of distributions. They allow you to simulate random variables, so you can handle really complicated distributions like yours and obtain the final distribution and moment information.

Do you need to get an analytic answer or do you need to just get some estimates?

5. Re: How can I combine these two likelihoods?

I am trying to find a maximum likelihood estimate for N.

6. Re: How can I combine these two likelihoods?

What is N? How do you define it?

7. Re: How can I combine these two likelihoods?

Sorry, I mean a maximum likelihood estimate for B, which is defined as B1 + B2.

8. Re: How can I combine these two likelihoods?

Are they independent? If so, you can use the convolution theorem.

If not you will have to express the relationship.

Also - P(A and B) = P(B|A)P(A) which means you can use this to get the joint likelihood function.

Once you have this, you maximize it to find the solution for the joint likelihood; then you use the MLE invariance theorem and apply it to the B1 + B2 case [but multivariable calculus will be involved here].
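A brute-force sketch of that recipe (made-up numbers, scipy assumed): since independence makes the joint likelihood factor, each binomial N can be maximized separately by grid search, and invariance then gives the MLE of B1 + B2:

```python
from scipy.stats import binom

# Illustrative data
b1, p1 = 30, 0.45
b2, p2 = 20, 0.65

# Under independence the joint likelihood factors, so each binomial
# "N" parameter can be maximized on its own.
B1_hat = max(range(b1, 500), key=lambda N: binom.logpmf(b1, N, p1))
B2_hat = max(range(b2, 500), key=lambda N: binom.logpmf(b2, N, p2))

# MLE invariance: the MLE of B = B1 + B2 is B1_hat + B2_hat
B_hat = B1_hat + B2_hat
print(B1_hat, B2_hat, B_hat)
```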

9. Re: How can I combine these two likelihoods?

Hi chiro,

Thanks for the info. I'm not too familiar with the convolution theorem, but wouldn't either approach still require B1 and B2 to be estimated by the MLE?

Also, the events are, for the purpose of the analysis, independent.

10. Re: How can I combine these two likelihoods?

You estimate it by maximizing the likelihood for the distribution that corresponds to B1 + B2 [which has its own distribution].