# Thread: Numerical Analysis - finding least squares approximations

1. ## Numerical Analysis - finding least squares approximations

Hi,

I have to find least squares approximations to the functions cos(x) and exp(x), each on the interval [0,1].

How do I do this??!

2. Originally Posted by lifeonmars
Hi,

I have to find least squares approximations to the functions cos(x) and exp(x), each on the interval [0,1].

How do I do this??!
Let us concentrate on y = cos x.
Are you asking me to work it out for you?
It takes a long time.

You want to find y = mx + b which minimizes the integral of the squared error, int[(cos x - (mx+b))^2 dx] from 0 to 1, between the curve cos x and the line mx + b.

3. Originally Posted by lifeonmars
Hi,

I have to find least squares approximations to the functions cos(x) and exp(x), each on the interval [0,1].

How do I do this??!
We will need more information. Primarily, what type of function do you want
to fit cos(x) and exp(x) with, and what maximum order of basis function do
you want to use?

For instance, do you want a least squares linear approximation, quadratic,
cubic, Fourier sine series, cosine series, Gaussian sum, ...

RonL

4. Argh, sorry. I deleted the bit of my post with the pertinent extra information in it. Quadratic least squares!

Thanks!

5. For y=cos x on [0,1]
y=-.431x^2-.035x+1.00 and R^2=.999

For y=e^x on [0,1]
y=.836x^2+.855x+1.011 and R^2=.999

This is my 47th post!
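The fits quoted above can be reproduced numerically. Here is a minimal Python sketch for y = e^x (an assumption about method: the poster may well have used a calculator's quadratic regression, so this mimics that by fitting sampled points via the discrete normal equations):

```python
import math

# Sample e^x on a uniform grid over [0, 1]
n = 1001
xs = [i / (n - 1) for i in range(n)]
ys = [math.exp(x) for x in xs]

# Discrete normal equations for a quadratic fit a0 + a1*x + a2*x^2:
# A[j][k] = sum of x^(j+k), b[j] = sum of x^j * y
A = [[sum(x ** (j + k) for x in xs) for k in range(3)] for j in range(3)]
b = [sum(x ** j * y for x, y in zip(xs, ys)) for j in range(3)]

def solve3(A, b):
    # Plain Gaussian elimination with partial pivoting on a 3x3 system
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

a0, a1, a2 = solve3(A, b)
print(f"y = {a2:.3f}x^2 + {a1:.3f}x + {a0:.3f}")
```

With this many sample points the discrete fit agrees with the continuous one to about three decimal places, close to the coefficients quoted above.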

6. Take the cos x example - I've written down three equations:

(int stands for integral, from 0 to 1)

a0*int[(1.dx)]+a1*int[x.dx]+a2*int[x^2.dx]=int[cos(x).dx]
a0*int[(x.dx)]+a1*int[x^2.dx]+a2*int[x^3.dx]=int[x.cos(x).dx]
a0*int[(x^2.dx)]+a1*int[x^3.dx]+a2*int[x^4.dx]=int[x^2.cos(x).dx]

and then performed the integrations, solved the simultaneous equations, and come out with values for a0, a1, and a2; these are the coefficients of the x^0, x^1 and x^2 terms respectively.

Is this the right method?

Thanks.
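These are exactly the normal equations for the continuous least squares fit, and they can be solved in closed form: the coefficient matrix int[x^(j+k) dx] = 1/(j+k+1) is the 3x3 Hilbert matrix, whose inverse has known integer entries, and the right-hand sides for cos(x) integrate by parts to simple expressions. A minimal Python sketch:

```python
import math

s1, c1 = math.sin(1.0), math.cos(1.0)
b = [s1,            # int[cos(x) dx]     = sin(1)
     s1 + c1 - 1,   # int[x cos(x) dx]   = sin(1) + cos(1) - 1
     2 * c1 - s1]   # int[x^2 cos(x) dx] = 2cos(1) - sin(1)

# Known inverse of the 3x3 Hilbert matrix [[1, 1/2, 1/3],
# [1/2, 1/3, 1/4], [1/3, 1/4, 1/5]]
Hinv = [[9, -36, 30], [-36, 192, -180], [30, -180, 180]]
a0, a1, a2 = (sum(Hinv[r][c] * b[c] for c in range(3)) for r in range(3))
print(f"p(x) = {a2:.4f}x^2 {a1:+.4f}x {a0:+.4f}")
```

This gives approximately p(x) = -0.431x^2 - 0.037x + 1.003, matching the quoted fit for cos x.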

7. Originally Posted by lifeonmars

Take the cosx example - I've written down three equations:

(int stands for integral, from 0 to 1)

a0*int[(1.dx)]+a1*int[x.dx]+a2*int[x^2.dx]=int[cos(x).dx]
a0*int[(x.dx)]+a1*int[x^2.dx]+a2*int[x^3.dx]=int[x.cos(x).dx]
a0*int[(x^2.dx)]+a1*int[x^3.dx]+a2*int[x^4.dx]=int[x^2.cos(x).dx]

and then performed the integrations, solved the simultaneous equations, and come out with values for a0, a1, and a2; these are the coefficients of the x^0, x^1 and x^2 terms respectively.

Is this the right method?

Thanks.
Yes - those are the normal equations, and that works. There are many ways of
doing this; which one you use will depend on what you have been covering in
the course that this question is from.

The most elegant is to use orthonormal polynomials on (0,1), which we may
call P_0(x), P_1(x), and P_2(x); then the coefficients of these polynomials in
the expansion of cos(x) are:

a_0 = int[P_0(x) cos(x) dx], x from 0 to 1

a_1 = int[P_1(x) cos(x) dx], x from 0 to 1

a_2 = int[P_2(x) cos(x) dx], x from 0 to 1

Then the required quadratic polynomial is:

a_0 * P_0(x) + a_1 * P_1(x) + a_2 * P_2(x).

(Note P_0(x) is a constant polynomial; the x is just for consistency with
the others.)

If you don't know the first three orthonormal polynomials on (0,1), you
can find them by applying the Gram-Schmidt process to: 1, x, x^2.

RonL
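This route can be sketched in a few lines of Python. Representing each polynomial by its coefficient list (constant term first), the inner product is <x^i, x^j> = int[x^(i+j) dx] = 1/(i+j+1); Gram-Schmidt on 1, x, x^2 then produces (up to sign) the orthonormal shifted Legendre polynomials on [0,1]:

```python
import math

def inner(p, q):
    # <p, q> = int[p(x) q(x) dx] on (0,1) for coefficient lists
    return sum(pc * qc / (i + j + 1)
               for i, pc in enumerate(p) for j, qc in enumerate(q))

ortho = []
for v in ([1, 0, 0], [0, 1, 0], [0, 0, 1]):      # 1, x, x^2
    w = list(v)
    for u in ortho:                               # subtract projections
        c = inner(v, u)
        w = [wi - c * ui for wi, ui in zip(w, u)]
    norm = math.sqrt(inner(w, w))                 # normalize
    ortho.append([wi / norm for wi in w])

# a_k = int[P_k(x) cos(x) dx], built from the closed-form moments
# int[x^n cos(x) dx] on (0,1) for n = 0, 1, 2
s1, c1 = math.sin(1.0), math.cos(1.0)
moments = [s1, s1 + c1 - 1, 2 * c1 - s1]
coeffs = [sum(pc * moments[i] for i, pc in enumerate(P)) for P in ortho]

# Reconstruct a_0 P_0 + a_1 P_1 + a_2 P_2 as an ordinary quadratic
quad = [sum(a * P[i] for a, P in zip(coeffs, ortho)) for i in range(3)]
print(quad)
```

The reconstructed quadratic agrees with the one from the normal-equations route (about 1.003 - 0.037x - 0.431x^2), and the second orthonormal polynomial comes out as sqrt(3)(2x - 1), the familiar shifted Legendre P_1.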