I suspect not many people understand what you wrote. Or is that just me?
No one answers my questions, but I figure I might as well try.
In solving a problem I've stumbled across something that I'd like to verify. Suppose $X$ is a mean-zero random vector in $\mathbb{R}^p$ with $\operatorname{Var}(X) = \sigma^2 I_p$ and $Q$ is a $p \times p$ symmetric matrix. Then, for any function $f$, the following holds:
$$\mathbb{E}\left[X^\top f(Q) X\right] = \sigma^2 \operatorname{tr}\left(f(Q)\right).$$
The idea behind this is to apply the spectral theorem to write $Q = P \Lambda P^\top$ and define $Y = P^\top X$. Then $f(Q) = P f(\Lambda) P^\top$ and $X^\top f(Q) X = Y^\top f(\Lambda) Y = \sum_{i=1}^p f(\lambda_i) Y_i^2$.
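A quick NumPy check of that algebraic identity (the particular symmetric $Q$, the dimension, and the choice $f = \exp$ are just illustrative, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 5

# A random symmetric Q and its spectral decomposition Q = P diag(lam) P^T
A = rng.standard_normal((p, p))
Q = (A + A.T) / 2
lam, P = np.linalg.eigh(Q)

f = np.exp  # any scalar function, applied entrywise to the eigenvalues

# Matrix function f(Q) = P f(Lambda) P^T
fQ = P @ np.diag(f(lam)) @ P.T

# For any fixed vector X, X^T f(Q) X should equal sum_i f(lam_i) Y_i^2
X = rng.standard_normal(p)
Y = P.T @ X

lhs = X @ fQ @ X
rhs = np.sum(f(lam) * Y**2)
assert np.isclose(lhs, rhs)
```

This is a deterministic identity (no expectations involved yet), so it holds draw by draw.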
From there, argue
$$\mathbb{E}\left[X^\top f(Q) X\right] = \sum_{i=1}^p f(\lambda_i)\, \mathbb{E}\left[Y_i^2\right] = \mathbb{E}\left[Y_1^2\right] \sum_{i=1}^p f(\lambda_i) = \sigma^2 \operatorname{tr}\left(f(Q)\right).$$
(The $\lambda_i$ are the eigenvalues of $Q$, if that isn't clear.) I guess the main thing I need to get this off the ground is that the $Y_i^2$ are identically distributed for all $i$. This seems true, since I think the $Y_i$ are identically distributed for all $i$: $P$ is orthogonal, so $\operatorname{Var}(Y) = P^\top (\sigma^2 I_p) P = \sigma^2 I_p$, and in particular $\mathbb{E}[Y_i^2] = \sigma^2$ for every $i$, which is all the computation actually uses. This justifies the steps where I pull the expectation inside the sum and factor out $\mathbb{E}[Y_1^2]$.
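A Monte Carlo sanity check of the full claim (a sketch: it assumes $X \sim N_p(0, \sigma^2 I_p)$, and the values of $p$, $\sigma$, and $f = \exp$ are arbitrary test choices):

```python
import numpy as np

rng = np.random.default_rng(1)
p, sigma, n = 4, 2.0, 200_000

# Random symmetric Q, its eigendecomposition, and the matrix function f(Q)
A = rng.standard_normal((p, p))
Q = (A + A.T) / 2
lam, P = np.linalg.eigh(Q)
f = np.exp
fQ = P @ np.diag(f(lam)) @ P.T

# n draws of X ~ N(0, sigma^2 I_p); one quadratic form X^T f(Q) X per draw
X = sigma * rng.standard_normal((n, p))
quad = np.einsum('ni,ij,nj->n', X, fQ, X)

mc = quad.mean()                 # Monte Carlo estimate of E[X^T f(Q) X]
exact = sigma**2 * f(lam).sum()  # sigma^2 * tr(f(Q))
assert np.isclose(mc, exact, rtol=0.05)
```

The sample mean lands within a few percent of $\sigma^2 \operatorname{tr}(f(Q))$ at this sample size, which at least suggests the identity isn't off by a constant somewhere.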
Something is unclear? It's all completely standard matrix notation: $\operatorname{tr}(Q)$ is the trace of $Q$, $X$ is a random vector of length $p$ with the specified variance-covariance matrix, $Y$ is an appropriate orthogonal transformation of $X$, and $Q = P \Lambda P^\top$ is the spectral decomposition of $Q$...
I mean, it's not intro-level math-stat, but I don't think it's unreasonable for someone in statistics to read it and understand the notation. Maybe I'll stick with visiting my professor's office hours instead of hoping for faster answers here.