# Thread: Maximum Log-Likelihoods

1. ## Maximum Log-Likelihoods

Suppose r successes were obtained in a sequence of m Bernoulli trials with success probability $\displaystyle {\Theta}_1$, and that in an independent second sequence of n Bernoulli trials with success probability $\displaystyle {\Theta}_2$, there were s successes.

Show that the maximum of the log-likelihood of all observations is

$\displaystyle \ln{\binom{m}{r}} + \ln{\binom{n}{s}} + r\ln(r) + s\ln(s) + (m-r)\ln(m-r) + (n-s)\ln(n-s) - m\ln(m) - n\ln(n)$.

Is there a formula for this, and is it then just a case of plugging in the variables?

2. I've sort of advanced a little on this...

Since the two sequences of trials are independent, the total max log-likelihood is the sum of the two individual max log-likelihoods, i.e.

Trial 1: max log-likelihood is $\displaystyle \ln \binom{m}{r} + r\ln(r) + (m-r)\ln(m-r) - m\ln(m)$

Trial 2: max log-likelihood is $\displaystyle \ln \binom{n}{s} + s\ln(s) + (n-s)\ln(n-s) - n\ln(n)$

Adding these together gives total max log-likelihood.
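As a sanity check on the claim above, here is a quick numerical sketch (my own, not from the thread, with made-up values for m, r, n, s): it evaluates the binomial log-likelihood at the maximizing value $\Theta = r/m$ (resp. $s/n$) and compares the sum against the closed-form expression.

```python
from math import comb, log

def log_likelihood(theta, n_trials, successes):
    """Binomial log-likelihood: ln C(n,k) + k ln(theta) + (n-k) ln(1-theta)."""
    k = successes
    return log(comb(n_trials, k)) + k * log(theta) + (n_trials - k) * log(1 - theta)

def closed_form(n_trials, successes):
    """The thread's formula: ln C(n,k) + k ln(k) + (n-k) ln(n-k) - n ln(n)."""
    k = successes
    return (log(comb(n_trials, k)) + k * log(k)
            + (n_trials - k) * log(n_trials - k) - n_trials * log(n_trials))

m, r = 20, 7   # first sequence (example numbers, assumed for illustration)
n, s = 15, 4   # second sequence (example numbers, assumed for illustration)

# Evaluate each log-likelihood at its maximizing theta and sum.
total_at_mle = log_likelihood(r / m, m, r) + log_likelihood(s / n, n, s)
total_closed = closed_form(m, r) + closed_form(n, s)
print(abs(total_at_mle - total_closed) < 1e-12)  # → True
```

The two totals agree because substituting $\Theta_1 = r/m$ into $r\ln(\Theta_1) + (m-r)\ln(1-\Theta_1)$ gives exactly $r\ln(r) + (m-r)\ln(m-r) - m\ln(m)$ after expanding the logs of the fractions.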

Is the max log-likelihood for each individual trial found by doing something like finding the derivative of the log-likelihood and setting it to 0? I'm mostly unsure as to where the binomial expression comes from...

3. Gotten further now...

$\displaystyle \ln(L) = \ln(L({\Theta}_1 \mid m,r))$

$\displaystyle = \ln \binom{m}{r} + r\ln({\Theta}_1) + (m-r)\ln(1 - {\Theta}_1)$...
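To answer the derivative question from post 2, the remaining step would go something like this (my own sketch of the standard calculation, not from the thread): differentiate with respect to $\Theta_1$, noting the binomial coefficient is constant in $\Theta_1$ so it drops out, and set the derivative to zero.

$$\frac{\partial \ln L}{\partial \Theta_1} = \frac{r}{\Theta_1} - \frac{m-r}{1-\Theta_1} = 0 \;\implies\; r(1-\Theta_1) = (m-r)\Theta_1 \;\implies\; \hat{\Theta}_1 = \frac{r}{m}.$$

Substituting $\hat{\Theta}_1 = r/m$ back into $\ln L$ then yields the trial-1 expression from post 2, and the same argument with $n, s$ gives trial 2. The binomial coefficient survives into the maximum because it never depended on $\Theta_1$ in the first place.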