(Here n is the size of the matrix, and we are dealing with random symmetric matrices whose entries are drawn from the standard normal distribution.)
'As n tends to infinity, the kth moment of the mean eigenvalue distribution of the matrix tends to the kth moment of the standard semicircle law. By the method of moments, this shows that the mean eigenvalue distribution tends to the standard semicircle law as n gets larger.'
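For what it's worth, the claim is easy to check numerically. Below is a quick sketch (my own, not from the quoted source) that builds a symmetric Wigner matrix, rescales the eigenvalues by √n so the spectrum lands on [-2, 2], and compares the empirical even moments against the semicircle moments, which are the Catalan numbers (1, 2, 5, ...):

```python
import numpy as np
from math import comb

rng = np.random.default_rng(0)
n = 2000

# Symmetric "Wigner" matrix: symmetrize an iid N(0,1) matrix so that
# off-diagonal entries still have variance 1.
A = rng.normal(size=(n, n))
W = (A + A.T) / np.sqrt(2)

# Rescale by sqrt(n): the empirical spectral distribution then
# converges to the semicircle law supported on [-2, 2].
eigs = np.linalg.eigvalsh(W) / np.sqrt(n)

# The 2k-th moment of the standard semicircle law is the k-th
# Catalan number C_k = binom(2k, k) / (k + 1); odd moments vanish.
for two_k in (2, 4, 6):
    k = two_k // 2
    catalan = comb(2 * k, k) // (k + 1)
    empirical = np.mean(eigs ** two_k)
    print(two_k, round(float(empirical), 3), catalan)
```

With n = 2000 the empirical moments already sit close to 1, 2 and 5, which is consistent with the moment convergence the quote asserts.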
Firstly, is this correct, and secondly, how is the method of moments used here? I thought it only dealt with sequences of distributions...?
I haven't done any probability theory for years, so I apologise if this is a simple question.