I have an analysis question that's bugging me. In an old qualifying exam, there was the following question:

Let $\displaystyle f_j$ be an orthonormal sequence in $\displaystyle L^{2}([0,1])$. Prove that $\displaystyle S_{n}=\frac{1}{n}\sum_{j=1}^{n} f_{j}$ converges to zero a.e.

Now, I know how to show that $\displaystyle S_{n}$ converges to $0$ in $\displaystyle L^{2}$ norm (hence it converges to $0$ in measure, and therefore some subsequence converges to zero a.e.).
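For reference, the $\displaystyle L^{2}$ computation is just orthonormality (the cross terms vanish):

$\displaystyle \|S_n\|_2^2 = \frac{1}{n^2}\Big\|\sum_{j=1}^{n} f_j\Big\|_2^2 = \frac{1}{n^2}\sum_{j=1}^{n}\|f_j\|_2^2 = \frac{1}{n^2}\cdot n = \frac{1}{n}\to 0.$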

However, I have a vague memory that when I last looked at this problem, there was a counterexample to the question as stated. Can anyone tell me at least whether or not a counterexample exists?

When I tried to prove the statement as given, the only approach I could think of was to show that the integral of $\displaystyle \lim_{n}|S_{n}|$ is $0$, but that seems like a dead end, since I'm doubtful I could get dominated convergence to work. When I tried to build a counterexample, I didn't seem to have much freedom to do anything interesting: averages of initial segments of orthonormal sequences all seem to look the same.
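In case anyone wants to experiment, here is a quick numerical sanity check I wrote. It uses one concrete orthonormal system of my own choosing, $\displaystyle f_j(x)=\sqrt{2}\sin(j\pi x)$ on $[0,1]$ (nothing in the problem forces this choice), and just confirms that $\displaystyle \|S_n\|_2 \approx 1/\sqrt{n}$:

```python
import numpy as np

# A concrete orthonormal system in L^2([0,1]) (my choice, for illustration):
# f_j(x) = sqrt(2) * sin(j * pi * x).
x = np.linspace(0.0, 1.0, 4001)  # uniform grid on [0,1]

def S(n):
    # S_n(x) = (1/n) * sum_{j=1}^{n} f_j(x), evaluated on the grid
    j = np.arange(1, n + 1)[:, None]
    return np.sqrt(2.0) * np.sin(j * np.pi * x).sum(axis=0) / n

for n in (10, 100, 1000):
    # Riemann-sum approximation of ||S_n||_2; should be about 1/sqrt(n)
    l2 = np.sqrt(np.mean(S(n) ** 2))
    print(n, l2)
```

Of course this only illustrates the norm convergence, not the pointwise behaviour, but it makes it easy to plot $S_n$ and stare at it.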

Help?