Let me rephrase it:
$\displaystyle \forall \epsilon>0,\quad {\color{red}\int (f_n-f)^2 \,d\mu} \geq \int_{\{|f_n-f|>\epsilon\}} (f_n-f)^2 \,d\mu \geq {\color{blue}\epsilon^2\, \mu(|f_n-f|>\epsilon)} \geq 0$

(The middle inequality holds because $(f_n-f)^2 \geq \epsilon^2$ everywhere on the set $\{|f_n-f|>\epsilon\}$.)
Since $\displaystyle f_n$ converges to $\displaystyle f$ in $L^2$, the red part goes to $0$ as $n$ goes to infinity, by the very definition of convergence in $L^2$.
Thus, by the sandwich theorem, the blue part also goes to $0$ as $n$ goes to infinity. (When we say "for all $\epsilon$", it means you pick any $\epsilon > 0$ and fix it before letting $n$ vary, so $\epsilon$ behaves like a constant throughout.)
Thus $\displaystyle \lim_{n\to\infty} {\color{blue} \mu(|f_n-f|>\epsilon)}=0$, which means, by definition, that $\displaystyle f_n$ converges to $\displaystyle f$ in measure.
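To see the inequality in action, here is a small numerical sketch with a concrete (hypothetical, not from the post) example: $f_n(x) = x^n$ on $[0,1]$ with Lebesgue measure and $f = 0$. Both quantities have closed forms here, so we can check that the blue part is squeezed below the red part and that both shrink to $0$:

```python
# Hypothetical illustrating example: f_n(x) = x^n on [0, 1], f = 0,
# with Lebesgue measure. Closed forms:
#   red part:  ∫ (f_n - f)^2 dμ = ∫_0^1 x^(2n) dx = 1/(2n + 1)
#   bad set:   μ(|f_n - f| > ε) = μ({x : x^n > ε}) = 1 - ε^(1/n)

def l2_error_sq(n):
    """Red part: the squared L² distance between f_n and f."""
    return 1 / (2 * n + 1)

def measure_of_bad_set(n, eps):
    """μ(|f_n - f| > ε): measure of the set where f_n is still far from f."""
    return 1 - eps ** (1 / n)

eps = 0.5
for n in [1, 10, 100, 1000]:
    red = l2_error_sq(n)
    blue = eps ** 2 * measure_of_bad_set(n, eps)
    # The post's inequality: ε² μ(|f_n - f| > ε) ≤ ∫ (f_n - f)² dμ
    assert blue <= red
    print(f"n={n:5d}  red={red:.6f}  eps^2*mu={blue:.6f}")
```

Both columns tend to $0$, which is exactly the sandwich argument in numbers: the red column dominates the blue one, so $L^2$ convergence drags the measure of the bad set down with it.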
Does that look better?