Hi! I'm trying to find the MLE estimator for a logit model. According to a paper I'm reading, the following is true:

L = $\displaystyle \prod_{i=1}^{N} \left(\frac{\exp(\beta_0 + \beta_1 X_i)}{1 + \exp(\beta_0 + \beta_1 X_i)}\right)^{Y_i}\left(\frac{1}{1 + \exp(\beta_0 + \beta_1 X_i)}\right)^{1 - Y_i}$

$\displaystyle = \left[\prod_{i=1}^{N} \frac{1}{1 + \exp(\beta_0 + \beta_1 X_i)}\right] \exp\!\left(\beta_0 \sum_{i=1}^{N} Y_i + \beta_1 \sum_{i=1}^{N} Y_i X_i\right)$
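To convince myself that the two expressions really are equal, I checked them numerically on made-up data (the coefficients and data below are arbitrary, just for the check):

```python
import math
import random

# Sanity check (my own sketch, arbitrary data): evaluate both forms of the
# likelihood and confirm they agree numerically.
random.seed(0)
b0, b1 = -0.5, 1.2                      # arbitrary coefficients
X = [random.gauss(0, 1) for _ in range(20)]
Y = [random.randint(0, 1) for _ in range(20)]

# Form 1: product of Bernoulli terms p_i^{Y_i} (1 - p_i)^{1 - Y_i}
L_form1 = 1.0
for x, y in zip(X, Y):
    p = math.exp(b0 + b1 * x) / (1 + math.exp(b0 + b1 * x))
    L_form1 *= p**y * (1 - p)**(1 - y)

# Form 2: [prod 1/(1 + exp(b0 + b1 X_i))] * exp(b0 * sum Y_i + b1 * sum Y_i X_i)
prod = 1.0
for x in X:
    prod /= 1 + math.exp(b0 + b1 * x)
L_form2 = prod * math.exp(b0 * sum(Y) + b1 * sum(y * x for x, y in zip(X, Y)))

print(math.isclose(L_form1, L_form2))   # True
```

So the equality itself checks out; it's the algebra behind it that I can't reproduce.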

and then you go on by taking the log, calculating the first derivative, and maximizing.

But what I don't understand is how and why you can simplify the first expression into the second. Can someone please help explain?

My suggestion was to take the log of the first expression directly and obtain:

ln L = $\displaystyle \sum_{i=1}^{N} Y_i \ln\!\left[\frac{\exp(\beta_0 + \beta_1 X_i)}{1 + \exp(\beta_0 + \beta_1 X_i)}\right] + \sum_{i=1}^{N} (1 - Y_i) \ln\!\left[\frac{1}{1 + \exp(\beta_0 + \beta_1 X_i)}\right]$
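I also checked numerically that my version of the log-likelihood equals the log of the paper's simplified form, so at least both routes should give the same maximizer (again a quick sketch with arbitrary data):

```python
import math
import random

# Sanity check (my own sketch, arbitrary data): my log-likelihood vs. the log
# of the paper's simplified form.
random.seed(1)
b0, b1 = 0.3, -0.8
X = [random.gauss(0, 1) for _ in range(20)]
Y = [random.randint(0, 1) for _ in range(20)]

# My expression: sum Y_i ln(p_i) + sum (1 - Y_i) ln(1 - p_i)
lnL_mine = 0.0
for x, y in zip(X, Y):
    p = math.exp(b0 + b1 * x) / (1 + math.exp(b0 + b1 * x))
    lnL_mine += y * math.log(p) + (1 - y) * math.log(1 - p)

# Log of the paper's simplified form:
# -sum ln(1 + exp(b0 + b1 X_i)) + b0 sum Y_i + b1 sum Y_i X_i
lnL_paper = (-sum(math.log(1 + math.exp(b0 + b1 * x)) for x in X)
             + b0 * sum(Y)
             + b1 * sum(y * x for x, y in zip(X, Y)))

print(math.isclose(lnL_mine, lnL_paper))  # True
```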

but maybe that's not the right way to do it?

Thanks for your help.