Hello,

This question is about the total number of MCMC samples $T$ required to reach a desired accuracy.

Suppose the mixing time of the Markov chain is $T_1 = O(N \log N)$, i.e. the number of steps needed so that $\|P_t - \pi\| < \epsilon$ for every $t \ge T_1$, where $P_t$ is the distribution of the chain after $t$ steps and $\pi$ is the target distribution.
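Written out (I am leaving the norm unspecified; total variation is what I have in mind), what I mean by $T_1$ is
$$ T_1 \;=\; \min\{\, t : \|P_s - \pi\| < \epsilon \ \text{for all } s \ge t \,\} \;=\; O(N \log N). $$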

Suppose that $M$ further samples from the chain are then required to reach the desired accuracy (because we are sampling from $P_t$ rather than from $\pi$ itself).

Then the overall time complexity (i.e. the total number of samples required) is $O(M\, N \log N)$.

I don't understand the last part. Shouldn't it be $O(M + N \log N)$?
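To make my reasoning explicit (this is my own counting, assuming the chain only has to mix once): I would run $T_1$ burn-in steps and then keep the next $M$ samples, so
$$ T \;=\; T_1 + M \;=\; O(N \log N + M), $$
whereas the stated bound $O(M\, N \log N)$ looks as if a full mixing time is charged for each of the $M$ samples, i.e. $T = M \cdot T_1$.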