I have an analysis question that's bugging me. In an old qualifying exam, there was the following question:

Let $\{f_n\}$ be an orthonormal sequence in $L^2[0,1]$. Prove that $\frac{1}{N}\sum_{n=1}^{N} f_n$ converges to zero a.e.

Now, I know how to show that $\frac{1}{N}\sum_{n=1}^{N} f_n$ converges to $0$ in norm (hence it also converges in measure, and therefore at least some subsequence converges to zero a.e.).
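To spell out that norm computation (under my reading that the $f_n$ are orthonormal in $L^2$): expanding the square and using $\langle f_n, f_m \rangle = \delta_{nm}$,

$$\left\|\frac{1}{N}\sum_{n=1}^{N} f_n\right\|_2^2 \;=\; \frac{1}{N^2}\sum_{n,m=1}^{N}\langle f_n, f_m\rangle \;=\; \frac{N}{N^2} \;=\; \frac{1}{N} \;\longrightarrow\; 0,$$

and Chebyshev's inequality then gives convergence in measure.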

However, I have a vague memory that when I last looked at this problem, there was a counterexample to the question as stated. Can anyone tell me at least whether or not a counterexample exists?

When I tried proving the problem as given, I couldn't think of an approach beyond showing that the integral of the limit of the absolute values of the averages $\frac{1}{N}\sum_{n=1}^{N} f_n$ is $0$, but that seems like a dead end, since I'm doubtful I could get dominated convergence to work. When I tried constructing a counterexample, it didn't seem like I had much freedom to do anything interesting: averages of initial segments of orthonormal sets all seem to look the same.
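To illustrate that last point numerically, here is a quick sketch (my own example, not from the exam) using the concrete orthonormal system $f_n(x) = \sqrt{2}\sin(n\pi x)$ on $[0,1]$; by orthonormality the $L^2$ norm of the average should be exactly $1/\sqrt{N}$:

```python
import numpy as np

# Illustration: averages of the orthonormal system f_n(x) = sqrt(2) sin(n*pi*x)
# in L^2[0,1].  By orthonormality, the L^2 norm of (1/N) * sum_{n=1}^N f_n
# equals 1/sqrt(N) exactly; we check this with a uniform-grid approximation
# of the integral.
x = np.linspace(0.0, 1.0, 200_001)  # fine uniform grid on [0, 1]

results = {}
for N in (10, 100, 1000):
    avg = sum(np.sqrt(2.0) * np.sin(n * np.pi * x) for n in range(1, N + 1)) / N
    l2_norm = np.sqrt(np.mean(avg ** 2))  # approximates (integral of avg^2)^(1/2)
    results[N] = l2_norm
    print(f"N = {N:4d}:  ||avg||_2 ~ {l2_norm:.5f},  1/sqrt(N) = {1 / np.sqrt(N):.5f}")
```

The norms shrink like $1/\sqrt{N}$ regardless of which orthonormal system you pick, which is what I mean by the averages "all looking the same" in norm; of course this says nothing about pointwise behavior, which is the whole question.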

Help?