This question is about the number of MCMC samples $T$ required to reach a desired accuracy.

Suppose the mixing time of the Markov chain is $\displaystyle T_1 = O(N \log N)$,
i.e. the number of steps required so that for any $t \ge T_1$, $\displaystyle \|P_t - \pi\| < \epsilon$, where $P_t$ is the distribution of the chain after $t$ steps and $\displaystyle \pi$ is the target distribution.

Suppose that $M$ further samples from $P_t$ are required to reach the desired accuracy (due to the fact that we sample from $P_t$ instead of $\displaystyle \pi$).

Then the overall time complexity (i.e., the total number of samples required) is $\displaystyle O(M N \log N)$.

I don't understand the last part. Shouldn't it be $\displaystyle O(M + N \log N)$?
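To make the two candidate counts concrete, here is a minimal sketch (my own illustration, not from any source I'm quoting) that treats $\lceil N \log N \rceil$ as the burn-in length. It contrasts the scheme where the chain is restarted and re-mixed for every one of the $M$ draws, which costs on the order of $M N \log N$ steps, with the scheme where we burn in once and then take $M$ consecutive draws, which costs on the order of $N \log N + M$ steps:

```python
import math

def steps_restart(M, N):
    """Total steps if the chain is restarted for each draw:
    every one of the M draws pays the full mixing time."""
    t_mix = math.ceil(N * math.log(N))
    return M * t_mix  # O(M * N log N)

def steps_single_run(M, N):
    """Total steps if we burn in once, then take M consecutive draws."""
    t_mix = math.ceil(N * math.log(N))
    return t_mix + M  # O(N log N + M)

M, N = 1000, 100
print(steps_restart(M, N))     # restart-per-draw count
print(steps_single_run(M, N))  # single-burn-in count
```

So the question comes down to which of these two sampling schemes the complexity statement is assuming.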