Hello, are there any alternative ways to solve a system of equations other than Gaussian elimination?
An elegant, though computationally inconvenient, alternative consists in writing the linear system in matrix form...
$\displaystyle A\ \vec{x}= \vec{b}$ (1)
If $\displaystyle A^{-1}$ is the inverse of $\displaystyle A$, then the solution is...
$\displaystyle \vec{x}= A^{-1}\ \vec{b}$ (2)
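The two equations above can be sketched in a few lines of NumPy (the matrix and right-hand side are made up for illustration; in practice `np.linalg.solve` is preferred, since it avoids forming the inverse explicitly):

```python
import numpy as np

# Solve A x = b via x = A^{-1} b, as in equations (1) and (2).
# A and b are illustrative values, not from the original post.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.inv(A) @ b  # x = A^{-1} b
```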
Kind regards
$\displaystyle \chi$ $\displaystyle \sigma$
LU decomposition
Cholesky decomposition
QR decomposition
Take your pick.
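As a sketch of the first option, here is an LU-based solve using SciPy (assuming SciPy is available; the matrix and right-hand side are made-up examples). The factorisation is done once, after which any number of right-hand sides can be solved cheaply:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Illustrative system (values are made up).
A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
b = np.array([10.0, 12.0])

lu, piv = lu_factor(A)       # factor A = P L U once
x = lu_solve((lu, piv), b)   # then solve cheaply for each b
```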
In particular, Gaussian elimination is numerically a bad method: it makes significant rounding errors when solving systems that are close to singular, or when they are singular and you want the full solution space.
One should also include Cramer's method, which shows why such large errors can occur with Gaussian elimination when the determinant of the system is close to 0: the "common denominator" of the system's solutions is very small, so even a small rounding error can cause a huge variance in the results.
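Cramer's rule for a small system can be sketched as follows (made-up 2x2 example): each unknown is a ratio of determinants, and det(A) is the common denominator mentioned above.

```python
import numpy as np

# Cramer's rule on an illustrative 2x2 system.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 6.0])

det_A = np.linalg.det(A)       # the "common denominator"
A1 = A.copy(); A1[:, 0] = b    # replace column 1 by b
A2 = A.copy(); A2[:, 1] = b    # replace column 2 by b
x = np.array([np.linalg.det(A1), np.linalg.det(A2)]) / det_A
```

If det(A) is tiny, that final division amplifies any rounding error in the numerators, which is exactly the instability described above.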
I want to point out that LU decomposition is, in fact, an extension of Gaussian elimination, and that an LU decomposition may not actually exist. Furthermore, the Cholesky decomposition only applies to positive-definite Hermitian systems, which is a rather limited special case. The QR decomposition always exists, but using it to solve a system of linear equations eventually involves finding the inverse of the transpose of a submatrix of the "R" part (which typically involves using... Gaussian elimination).
The solvability of a system Ax = b is intimately bound up with "how singular" A is, so knowing the rank of A is useful.
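For instance (illustrative matrix), the rank immediately reveals singularity:

```python
import numpy as np

# A singular 3x3 matrix: row 2 is twice row 1, so the rows are
# linearly dependent and rank(A) < 3.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [0.0, 1.0, 1.0]])

r = np.linalg.matrix_rank(A)  # 2, so A is singular and Ax = b
                              # has either no solutions or infinitely many
```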
A method that may be useful in some cases is using a generalized inverse of the matrix (there are plenty of different generalized inverses; this one is very simple): if you have a matrix X such that AXA = A, and an equation Ax = b, then first check whether Xb is a solution. If it's not, then there aren't any solutions. If it is, then the general solution is x = Xb + (I - XA)y for any y in your space.
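This recipe can be tried out with the Moore-Penrose pseudoinverse, which is one particular generalized inverse satisfying AXA = A (the singular matrix and vectors below are made up for illustration):

```python
import numpy as np

# A rank-1 (singular) matrix; its column space is spanned by (1, 2).
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
X = np.linalg.pinv(A)            # one generalized inverse: A X A = A

b_good = np.array([1.0, 2.0])    # lies in the column space of A
b_bad = np.array([1.0, 0.0])     # does not

solvable = np.allclose(A @ (X @ b_good), b_good)    # True: Xb is a solution
unsolvable = np.allclose(A @ (X @ b_bad), b_bad)    # False: no solutions

# General solution: x = Xb + (I - XA)y for any y.
y = np.array([3.0, -1.0])
x = X @ b_good + (np.eye(2) - X @ A) @ y
```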
Why do you want to know? Is it out of interest, or is there a specific thing you want to do? Some methods are faster than others, while others are less susceptible to rounding errors.
Gaussian elimination has order $\displaystyle n^3$: it slows down with the cube of the matrix size. However, there are algorithms which work faster for matrices with lots of zeros ("sparse" matrices). For instance, a frontal solver.
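As a small illustration of the sparse case (assuming SciPy is available; the tridiagonal system below is a made-up example), a sparse solver exploits the zero structure instead of doing the full dense elimination:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

# A large tridiagonal system: only ~3n of the n^2 entries are nonzero.
n = 1000
A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1],
          shape=(n, n), format="csc")
b = np.ones(n)

x = spsolve(A, b)  # solves in roughly O(n) work for this banded structure
```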
You might want to read this bit of Wikipedia.
(I thought there would be some good reading on how to solve linear equations in Knuth's "The Art of Computer Programming" (aka "all the algorithms you'll ever need and more, explained"), but I can't seem to find anything. There is a bit in the second volume, but it only mentions it in passing... am I being stupid, or does he really not cover this stuff?!?)
Let me put it this way: I want a faster method than Gaussian elimination. We get a lot of problems with parameter solutions in the exams, and we don't have much time; that's the problem. Gauss takes a lot of time to calculate with.
I am familiar with Cramer's rule for solving, but it's only applicable to n x n systems, so I think there's only a very small chance our teacher would ask about those.
So is there a method that's faster than Gauss but can also handle parameter solutions (s, t)? Or is Gauss the only thing I can go with? I thought that perhaps some intelligent professor had come up with a good algorithm by now.
But Gaussian elimination is good, and it is fast for small matrices. You only need to do, approximately, $\displaystyle n^3/3$ calculations, and for a 3x3 matrix the elimination takes only three row operations:
$\displaystyle \left( \begin{array}{ccc}a & b & c \\d & e & f \\g & h & i \end{array} \right)$
$\displaystyle \rightarrow\left( \begin{array}{ccc}a & b & c \\0 & e_1 & f_1 \\g & h & i \end{array} \right)$
$\displaystyle \rightarrow\left( \begin{array}{ccc}a & b & c \\0 & e_1 & f_1 \\0 & h_2 & i_2 \end{array} \right)$
$\displaystyle \rightarrow\left( \begin{array}{ccc}a & b & c \\0 & e_1 & f_1 \\0 & 0 & i_3 \end{array} \right)$
Specifically, $\displaystyle ar_2-dr_1$ then $\displaystyle ar_3-gr_1$ and finally $\displaystyle e_1r_3-h_2r_2$.
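These three row operations can be sketched directly (the entries a through i are filled in with made-up numbers for illustration):

```python
import numpy as np

# Illustrative values for the entries a..i of the 3x3 matrix above.
M = np.array([[ 2.0,  1.0, -1.0],
              [-3.0, -1.0,  2.0],
              [-2.0,  1.0,  2.0]])

M[1] = M[0, 0] * M[1] - M[1, 0] * M[0]   # a*r2 - d*r1: zeros position d
M[2] = M[0, 0] * M[2] - M[2, 0] * M[0]   # a*r3 - g*r1: zeros position g
M[2] = M[1, 1] * M[2] - M[2, 1] * M[1]   # e1*r3 - h2*r2: zeros position h

# M is now upper triangular, ready for back-substitution.
```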