Originally Posted by **Aryth**

Suppose the function f(x) is defined on the interval (-L,L) and $\displaystyle \int_{-L}^L f^2(x)~dx$ is finite.

We want to approximate f(x) by a finite series of (co-)sines:

$\displaystyle f(x) \approx g_N(x) = A_0 + \sum_{n=1}^N \left[A_n\cos{\left(\frac{n\pi}{L}x\right)} + B_n\sin{\left(\frac{n\pi}{L}x\right)}\right]$

where parameters $\displaystyle A_0$, $\displaystyle A_n$, and $\displaystyle B_n$ are freely variable.

The mean square error of the approximation is defined as:

$\displaystyle E_N = \int_{-L}^L (f(x) - g_N(x))^2~dx$

a) Show that the choice $\displaystyle A_0 = a_0$, $\displaystyle A_n = a_n$, and $\displaystyle B_n = b_n$, where $\displaystyle a_0$, $\displaystyle a_n$, and $\displaystyle b_n$ are the Fourier coefficients of f, minimizes $\displaystyle E_N$.

I think I can figure out the rest once I understand this part, so I won't type it all out. I have never worked with the mean square error before, so I'm not sure how to start. Any help would be appreciated.
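As a sanity check (not a proof), part (a)'s claim can be tested numerically: pick some f, compute its Fourier coefficients, and confirm that perturbing them away from the Fourier values only increases $E_N$. The particular f below is an arbitrary choice of mine, and the integrals are approximated with the trapezoid rule:

```python
import numpy as np

# Sample function on (-L, L); f(x) = x*exp(-x^2) is an arbitrary choice.
L = np.pi
N = 3
x = np.linspace(-L, L, 200001)
f = x * np.exp(-x**2)

def fourier_coeffs(f, x, L, N):
    """Fourier coefficients a_0, a_n, b_n of f on (-L, L), via the trapezoid rule."""
    a0 = np.trapz(f, x) / (2 * L)
    a = np.array([np.trapz(f * np.cos(n * np.pi * x / L), x) / L for n in range(1, N + 1)])
    b = np.array([np.trapz(f * np.sin(n * np.pi * x / L), x) / L for n in range(1, N + 1)])
    return a0, a, b

def E(A0, A, B):
    """Mean square error E_N = integral of (f - g_N)^2 over (-L, L)."""
    g = A0 + sum(A[n - 1] * np.cos(n * np.pi * x / L)
                 + B[n - 1] * np.sin(n * np.pi * x / L)
                 for n in range(1, N + 1))
    return np.trapz((f - g) ** 2, x)

a0, a, b = fourier_coeffs(f, x, L, N)
E_fourier = E(a0, a, b)
# Any perturbation of the coefficients should give a strictly larger error.
E_perturbed = E(a0 + 0.1, a, b + 0.05)
print(E_fourier < E_perturbed)  # → True
```

This only checks one perturbation of one function, of course, but it makes the statement of (a) concrete before attempting the proof.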