Finding eigenvectors by numerical methods

I have a system of matrices to solve for a,b,c,d

$\displaystyle \begin{Bmatrix}2 & 3 & 5 & 2 & 1 \\4 & 3 & 1 &11 & 3 \\ 4 & 8 & 2 & 5 & 4\end{Bmatrix}$. $\displaystyle \begin{Bmatrix}1 \\a\\b\\c\\d\end{Bmatrix}=$ $\displaystyle \begin{Bmatrix}0 \\0\\0\\0\\0\end{Bmatrix}$

I have to use numerical methods, NOT analytical methods; that means I can't just multiply it out and solve for a, b, c, d simultaneously. A numerical method would be something like making an initial guess for a, b, c, d and applying an algorithm to find better estimates, repeating until the estimates are accurate, much like Newton-Raphson iteration does. In reality the matrices are 18x20 and 1x18, but I scaled the problem down to be more manageable.

Re: Finding eigenvectors by numerical methods

Your fundamental problem is that you have only three equations but four unknowns to determine. There will be infinitely many correct solutions. A "numerical method" might give you **one** solution but can't give all of them.

Re: Finding eigenvectors by numerical methods

Ok well I think the problem was designed to force us to not use analytical solutions. So do you know of any algorithm to find one solution?

Re: Finding eigenvectors by numerical methods

I don't think you can do it for a non-square matrix.

Re: Finding eigenvectors by numerical methods

Quote:

Originally Posted by

**Shakarri** I have a system of matrices to solve for a,b,c,d

$\displaystyle \begin{Bmatrix}2 & 3 & 5 & 2 & 1 \\4 & 3 & 1 &11 & 3 \\ 4 & 8 & 2 & 5 & 4\end{Bmatrix}$. $\displaystyle \begin{Bmatrix}1 \\a\\b\\c\\d\end{Bmatrix}=$ $\displaystyle \begin{Bmatrix}0 \\0\\0\\0\\0\end{Bmatrix}$

I have to use numerical methods, NOT analytical methods; that means I can't just multiply it out and solve for a, b, c, d simultaneously. A numerical method would be something like making an initial guess for a, b, c, d and applying an algorithm to find better estimates, repeating until the estimates are accurate, much like Newton-Raphson iteration does. In reality the matrices are 18x20 and 1x18, but I scaled the problem down to be more manageable.

Hi Shakarri! :)

Try it with this problem:

$\displaystyle \begin{Bmatrix}2 & 3 & 5 \\ 2 & 1 & 4 \\ 3 & 1 &11 \\ 3 & 4 & 8 \\ 2 & 5 & 4\end{Bmatrix}$. $\displaystyle \begin{Bmatrix}1 \\a\\b\end{Bmatrix}=$ $\displaystyle \begin{Bmatrix}0 \\0\\0\\0\\0\end{Bmatrix}$

The dimensions match now, whereas your problem statement has a dimension mismatch: a 3x5 matrix times a 5x1 vector gives a 3x1 vector, not the 5x1 zero vector you wrote on the right-hand side.

Moreover this problem is *overdetermined* and won't have an exact solution (unless I've bungled something).

You can use numerical methods to find values for a and b that bring the product as close to the zero vector on the right-hand side as possible.

A simple numerical method is to:

1. pick an initial starting point.

2. take a small step in each direction and evaluate how close we get to the result vector.

3. pick the direction that brings us closest.

4. repeat from 2 until no such direction can be found.
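The steps above can be sketched in code. This is only a minimal illustration (the function names, the starting step size, and the shrinking-step refinement in step 4 are my own additions), applied to the 5x3 example system:

```python
import numpy as np

# The 5x3 example system; we search for a, b that make
# A @ [1, a, b] as close to the zero result vector as possible.
A = np.array([[2, 3, 5],
              [2, 1, 4],
              [3, 1, 11],
              [3, 4, 8],
              [2, 5, 4]], dtype=float)

def residual(a, b):
    """Distance of A @ [1, a, b] from the zero result vector."""
    return np.linalg.norm(A @ np.array([1.0, a, b]))

def step_search(a=0.0, b=0.0, step=1.0, tol=1e-8):
    # 1. pick an initial starting point (a, b)
    best = residual(a, b)
    while step > tol:
        # 2. take a small step in each direction and evaluate
        candidates = [(a + da, b + db) for da, db in
                      ((step, 0), (-step, 0), (0, step), (0, -step))]
        # 3. pick the direction that brings us closest
        ca, cb = min(candidates, key=lambda p: residual(*p))
        if residual(ca, cb) < best:
            a, b, best = ca, cb, residual(ca, cb)
        else:
            # 4. no direction improves: shrink the step instead of giving up
            step /= 2
    return a, b, best

a, b, r = step_search()
```

Since the system is overdetermined, the residual `r` won't reach zero; the search settles near the least-squares optimum instead.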

More advanced numerical methods for such a minimization are *Powell* and *Levenberg-Marquardt*.

**EDIT**: Oops! I've missed the fact that you were interested in eigenvectors, which was only mentioned in the title.

The problem I sketched and these methods are not applicable to eigenvector determination.

Re: Finding eigenvectors by numerical methods

Here's an alternative numerical method to find eigenvectors.

1. Begin with an initial estimate for an eigenvector.

2. For each dimension take a small step in its direction and also in its opposite direction.

3. Calculate the image of the stepped vector and find the angle between the vector and its image.

4. Pick the step that reduces the angle the most.

5. Repeat from 2 until no such step can be found.
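The steps above can be sketched as follows, on a hypothetical 3x3 symmetric matrix (the matrix, step sizes, and names are illustrative, not from the thread; a shrinking step size stands in for "until no such step can be found"):

```python
import numpy as np

# Hypothetical symmetric test matrix (positive definite, so every
# eigenvector makes the angle below reach zero).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

def angle(v):
    """Angle between v and its image A @ v."""
    w = A @ v
    c = np.dot(v, w) / (np.linalg.norm(v) * np.linalg.norm(w))
    return np.arccos(np.clip(c, -1.0, 1.0))

def eigvec_search(v, step=0.5, tol=1e-10):
    # 1. begin with an initial estimate, normalised to unit length
    v = v / np.linalg.norm(v)
    while step > tol:
        best_angle, best_v = angle(v), None
        # 2. for each dimension, step in its direction and the opposite one
        for i in range(len(v)):
            for s in (step, -step):
                u = v.copy()
                u[i] += s
                u /= np.linalg.norm(u)
                # 3-4. keep the step that reduces the angle the most
                if angle(u) < best_angle:
                    best_angle, best_v = angle(u), u
        if best_v is None:
            # 5. no improving step at this size: shrink it and retry
            step /= 2
        else:
            v = best_v
    return v

v = eigvec_search(np.array([1.0, 0.0, 0.0]))
```

When the angle reaches zero, `A @ v` is parallel to `v`, which is exactly the eigenvector condition.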

And of course there are all sorts of numerical algorithms that do the same, but better.

These build on the *Singular Value Decomposition*, or *SVD* for short.
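For the original null-space problem, the SVD gives a direct answer: the right singular vectors beyond the rank of the matrix span its null space. A sketch using numpy (the right-hand side is taken as the 3-entry zero vector, since the product is 3x1; the rescaling assumes the chosen null vector has a nonzero first entry):

```python
import numpy as np

# The original 3x5 matrix from the question.
M = np.array([[2, 3, 5, 2, 1],
              [4, 3, 1, 11, 3],
              [4, 8, 2, 5, 4]], dtype=float)

# Full SVD: rows of Vt past rank(M) are an orthonormal basis of the
# null space of M (here rank is 3, so we get 2 basis vectors).
U, s, Vt = np.linalg.svd(M)
null_basis = Vt[len(s):]

# Any combination of these solves M @ x = 0. Rescale one so its first
# entry is 1, matching the [1, a, b, c, d] form of the question; pick
# the basis vector whose first entry is largest to avoid dividing by
# something tiny.
v0 = max(null_basis, key=lambda v: abs(v[0]))
x = v0 / v0[0]
a, b, c, d = x[1:]
```

Because the null space is two-dimensional, this is one of infinitely many solutions, as noted earlier in the thread.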