I am working through the equations on this page:

MLE (Maximum Likelihood) Parameter Estimation

First of all, I am trying to understand the concept of the maximum likelihood approach.

I assume that you take some empirical measurements of a particular variable (samples from a population) and assume that these follow a normal distribution. Does maximum likelihood then tweak these measurements in some way to get a general relationship?
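To make my mental model concrete, this is roughly what I imagine happening (a rough sketch of my own understanding, not taken from the page; in particular, the claim that the normal MLEs are the sample mean and the biased sample variance is my assumption):

```python
import numpy as np

# Hypothetical example: measurements assumed to come from a normal population.
rng = np.random.default_rng(0)
measurements = rng.normal(loc=10.0, scale=2.0, size=50)  # pretend these are my data

# My understanding: MLE picks the parameter values that make these observations
# most probable. For a normal model I believe that works out to the sample mean
# and the sample variance with divisor n (ddof=0).
mu_hat = measurements.mean()
sigma2_hat = measurements.var()  # ddof=0 by default

print(mu_hat, sigma2_hat)
```

Is that the right picture, or does "maximising the likelihood" mean something more than this?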

I have got to the second equation down, where it has the pipe and the product sign. At the end there is the form $(x_i;\ \theta_1, \theta_2, \ldots, \theta_k)$. What does the $;$ mean? The product of all $x$ with all $\theta$? (i.e. $x_1\theta_1, x_2\theta_2$, etc.)
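For reference, the equation I am asking about reads something like this (transcribing it from the page as best I can, so my notation may be slightly off):

$$ L(\theta_1, \theta_2, \ldots, \theta_k \mid x_1, x_2, \ldots, x_n) \;=\; \prod_{i=1}^{n} f(x_i;\ \theta_1, \theta_2, \ldots, \theta_k) $$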

If so, you know $x$ but you don't know $\theta$, and I assume $\theta$ is what you have to estimate by maximising $L$ or $\Lambda$. If so, what is $\Lambda$?
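Just so it's clear what I think "maximising $L$" means in practice, here is a small numerical sketch (entirely my own guess at the procedure, using scipy; the page itself doesn't show any code, and the variable names are mine):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)
x = rng.normal(loc=5.0, scale=1.5, size=100)   # the known data

def neg_log_likelihood(theta):
    """Negative log-likelihood of a normal model with unknown mu and sigma."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)                  # keep sigma positive
    return -np.sum(norm.logpdf(x, loc=mu, scale=sigma))

# Maximising L over theta is the same as minimising -log L.
result = minimize(neg_log_likelihood, x0=[0.0, 0.0])
mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
print(mu_hat, sigma_hat)
```

Is this (searching over $\theta$ for the values that make the observed $x$ most probable) what the page means, and is $\Lambda$ just another symbol for the same likelihood, or something different such as its logarithm?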

I guess I'm still struggling with the concept.