# Thread: Gaussian Elimination, Horizontal/Vertical vectors?

1. ## Gaussian Elimination, Horizontal/Vertical vectors?

Hello people,
First things first: I'm not a native English speaker, so while I'm fairly sure my day-to-day "conversation" English is fine, I'm not so sure about my "mathematical" English, so bear with me and correct me where needed, thank you!

I'm going through my first linear algebra course, and although I've finished the course, I'm still seeking clarification on a few things.

For example, when searching for a basis, solving systems of linear equations, etc., I usually use horizontal (row) vectors. However, I've seen that in some situations people use vertical (column) vectors to solve problems. Could you clarify when it is better to use which?
I don't know why, but somehow using vertical vectors "does not make sense" to me, as it seems that if I'm using elementary row operations I'm just "scrambling" the system.

Mark.

2. ## Re: Gaussian Elimination, Horizontal/Vertical vectors?

Can you give an example? It is not perfectly clear to me what isn't clear to you.

And are you familiar with the transpose of a matrix?

3. ## Re: Gaussian Elimination, Horizontal/Vertical vectors?

For example, suppose there are vectors v1, v2, v3, v4 which correspond to a system of linear equations.

I've seen people use both

v1 v1 v1 v1
v2 v2 v2 v2
v3 v3 v3 v3
v4 v4 v4 v4

(each vector written as a row) and

v1 v2 v3 v4
v1 v2 v3 v4
v1 v2 v3 v4
v1 v2 v3 v4

(each vector written as a column)

matrix arrangements, and I'd like to know what the difference is and when to use which.
I wish I knew how to use that fancy text-to-standard-math converter of yours; is there an explanation page on how to use it?

edit: Yes, I know what a transposed matrix is, but the only substantial property of the transpose that we were taught is that det(A) = det(A^T).

4. ## Re: Gaussian Elimination, Horizontal/Vertical vectors?

One can regard a matrix as a linear transformation that takes one kind of vector to another through matrix multiplication.

If your matrix A is m×n and you multiply the vector v = (v1, v2, ..., vn) by the matrix A on the left, then for this to make sense you have to write v as an n×1 matrix (a column vector).

Of course, there is an equivalent way of looking at things by considering right multiplication by A^T, in which case you would write v as a 1×n matrix (a row vector). Since matrix multiplication is not commutative (AB ≠ BA), we essentially have to make a choice as to whether we "write mappings on the left or right": that is, is the image of v under A written A(v), or (v)A^T? This choice is somewhat arbitrary, but the convention is to "write mappings on the left".
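To make the two viewpoints concrete, here is a small sketch in plain Python (no libraries; the helper names `mat_vec`, `vec_mat`, and `transpose` are just illustrative, not standard functions). It shows that multiplying a column vector on the left by A gives the same entries as multiplying the corresponding row vector on the right by A^T:

```python
def mat_vec(A, v):
    """Left multiplication A*v, with v treated as an n x 1 column vector."""
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

def vec_mat(v, B):
    """Right multiplication v*B, with v treated as a 1 x n row vector."""
    return [sum(v[i] * B[i][j] for i in range(len(v))) for j in range(len(B[0]))]

def transpose(A):
    """Swap rows and columns of A."""
    return [list(col) for col in zip(*A)]

A = [[1, 2, 3],
     [4, 5, 6]]    # a 2x3 matrix
v = [1, 0, -1]     # a vector in R^3

left = mat_vec(A, v)               # A applied to v as a column: 2 entries
right = vec_mat(v, transpose(A))   # v as a row times A^T: 2 entries

print(left, right)  # the two results agree entrywise: [-2, -2] [-2, -2]
```

So whether you write A(v) with column vectors or (v)A^T with row vectors, you compute the same numbers; it is purely a bookkeeping convention.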

The rank of a matrix A is totally independent of whether you use columns or rows to establish linear independence. The standard procedure is to use row-equivalence to establish column independence, but it is possible to use column-equivalence to establish row independence.
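A quick way to convince yourself of this is to row-reduce both A and A^T and count the pivots. Here is a minimal sketch in plain Python (the `rank` and `transpose` helpers are my own, written with exact `Fraction` arithmetic to avoid floating-point issues):

```python
from fractions import Fraction

def rank(rows):
    """Rank via Gaussian elimination: reduce to row echelon form, count pivots."""
    M = [[Fraction(x) for x in row] for row in rows]
    r, n_rows, n_cols = 0, len(rows), len(rows[0])
    for col in range(n_cols):
        # find a pivot in this column at or below row r
        pivot = next((i for i in range(r, n_rows) if M[i][col] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, n_rows):
            factor = M[i][col] / M[r][col]
            M[i] = [a - factor * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def transpose(A):
    return [list(col) for col in zip(*A)]

A = [[1, 2, 3],
     [2, 4, 6],   # a multiple of the first row
     [0, 1, 1]]

print(rank(A), rank(transpose(A)))  # both give 2
```

Row-reducing A and row-reducing A^T (which is the same as column-reducing A) find the same rank, which is the "row rank = column rank" fact behind the original question: it doesn't matter whether you lay your vectors out as rows or as columns when testing independence.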

There is a tutorial on using LaTeX in this thread: http://www.mathhelpforum.com/math-he...ial-19060.html

Note, however, that recent changes to the forum mean that LaTeX now has to be wrapped in "tex" tags instead of "math" tags, as detailed here: http://www.mathhelpforum.com/math-he...uncements.html