Math Help - Maximum likelihood

1. Maximum likelihood

How do I find the MLE for r1 and r2 if

p(x1, x2) = 1 - r1^x1 * r2^x2,

r1 < 1, r2 < 1,

given that x1 and x2 can each be either 1 or 0 (i.e. the pair can be (0,0), (0,1), (1,0) or (1,1))?

Any help will be appreciated.

2. I don't understand this distribution.
Are you saying that...
$P(X_1=x_1,X_2=x_2)=1-r_1^{x_1}r_2^{x_2}$
then...
$P(X_1=0,X_2=0)=0$
$P(X_1=1,X_2=0)=1-r_1$
$P(X_1=0,X_2=1)=1-r_2$
and
$P(X_1=1,X_2=1)=1-r_1r_2$
so the four probabilities don't sum to one.
I'm lost.

3. Here is the full explanation:

There are two risk factors that may cause an accident to happen. A risk factor $x_i$ is either present ($x_i = 1$) or absent ($x_i = 0$).

We have a function that specifies the probability $p(x_1, x_2)$ of an accident, but it does not determine the outcome in every concrete case.

We assume that each risk factor present multiplies the probability of a good outcome (no accident) by some factor $r_i < 1$. In other words, the unknown function p is of the form $p(x_1, x_2) = 1 - r_1^{x_1} r_2^{x_2}$, and we have to learn the parameters $r_1, r_2$.

How would you compute the ML hypothesis from the data, under the given assumption on p?
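One possible reading of this setup (my assumption, not stated explicitly in the thread) is that each observation records the factor pattern $(x_1, x_2)$ together with whether an accident actually occurred, so $r_1^{x_1} r_2^{x_2}$ is the *conditional* probability of no accident given the pattern, and the values of p over the four patterns need not sum to one. Under that reading, here is a minimal sketch of the ML fit; the counts and the grid search are purely illustrative:

```python
import math

# Each data point records the risk-factor pattern (x1, x2) and the outcome.
# P(no accident | x1, x2) = r1^x1 * r2^x2, so the likelihood is a product of
# Bernoulli terms. Data are summarized as counts per pattern.
# counts[(x1, x2)] = (trials, no_accidents) -- illustrative numbers only.
counts = {
    (1, 0): (100, 80),   # factor 1 alone: 80/100 good outcomes
    (0, 1): (100, 50),   # factor 2 alone: 50/100 good outcomes
    (1, 1): (100, 40),   # both factors:   40/100 good outcomes
}

def log_likelihood(r1, r2):
    ll = 0.0
    for (x1, x2), (n, k) in counts.items():
        q = (r1 ** x1) * (r2 ** x2)   # P(no accident) for this pattern
        ll += k * math.log(q) + (n - k) * math.log(1 - q)
    return ll

# The (1,1) pattern couples r1 and r2, so there is no simple closed form;
# a coarse grid search stands in for a proper numerical optimizer here.
grid = [i / 100 for i in range(1, 100)]
r1_hat, r2_hat = max(((a, b) for a in grid for b in grid),
                     key=lambda p: log_likelihood(*p))
print(r1_hat, r2_hat)
```

With these illustrative counts the search lands near r1 ≈ 0.8, r2 ≈ 0.5, matching the per-pattern frequencies of good outcomes. In practice you would replace the grid with a bounded numerical optimizer such as `scipy.optimize.minimize`.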

4. Once again
$P(X_1=0,X_2=0)=0$
$P(X_1=1,X_2=0)=1-r_1$
$P(X_1=0,X_2=1)=1-r_2$
and
$P(X_1=1,X_2=1)=1-r_1r_2$
these don't sum to one.
Something is off here.
I can differentiate and estimate the two r's, but that seems meaningless here.