Definition of determinant and connection to linear dependence of columns

Hi,

In my book on linear algebra (first year university) the author starts off explaining the concept of determinants by focusing on dimensions n<=3, giving geometrical definitions of area and volume. Then he goes on to explain that if the area/volume given by the column vectors is zero, this obviously means there is linear dependence among these columns, and thus det A=0 is equivalent to AX=0 having non-trivial solutions.
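For concreteness, the n=2 case of this equivalence can be checked in a few lines (a sketch of my own, with an example matrix chosen for illustration):

```python
# 2x2 sketch of the claim above: proportional columns give det A = 0
# and a non-trivial solution of AX = 0. (Example matrix is illustrative.)
A = [[1, 2],
     [2, 4]]  # second column = 2 * first column

# det of a 2x2 matrix: ad - bc
det_A = A[0][0] * A[1][1] - A[0][1] * A[1][0]
print(det_A)  # 0: the parallelogram spanned by the columns is flat

# X = (2, -1) is a non-trivial solution of AX = 0
X = (2, -1)
print([row[0] * X[0] + row[1] * X[1] for row in A])  # [0, 0]
```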

I'm with him so far. But moving into higher dimensions, he suddenly gets very ... vague... So I've googled the net and looked at Wikipedia, but I still haven't really understood for n>3 what I understood for n<=3. Wikipedia gives ways to calculate determinants for n>3, but no real definition, at least nothing that makes clear the important implications that the determinant being zero or not has for systems of equations.

If anyone could help me with this I'd be very happy! In summary:

1. Definition of determinant for any n.

2. Validity for n>3 of the implications of det A=0 or det A<>0 that hold for n<=3.

Thank you,

Yair

Re: Definition of determinant and connection to linear dependence of columns

Well, here is the most general possible definition of "determinant". Given an n by n matrix, construct all possible products taking exactly one number from each row and column. It is easy to see that there are n! such products: you can choose any of the n numbers in the first column as the first factor; then choose any number in the second column **except** the one in the row you chose before, n-1 choices; then any of the numbers in the third column except the ones in those two previously chosen rows, n-2 choices; etc.

Each such product can be written ordered by row: $a_{1j_1}a_{2j_2}\cdots a_{nj_n}$. Because there is exactly one number from each column, the sequence "$j_1 j_2 \dots j_n$" of second indices is a permutation of "1 2 3 ... n". Multiply each product by 1 if that is an **even** permutation and by -1 if it is an **odd** permutation, and add all the products.

(Fortunately, there are much better ways of actually calculating the determinant.)

Those positive and negative terms cancel completely, giving a determinant of 0, precisely when the rows (or columns) are linearly dependent; the simplest case is one row or column being a multiple of another.
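The recipe above (one entry from each row and column, sign from the permutation's parity, sum all n! signed products) can be written out directly. A minimal sketch, fine for tiny matrices only, since it does n! work:

```python
from itertools import permutations
from math import prod

def sign(p):
    """+1 for an even permutation, -1 for an odd one (count inversions)."""
    inversions = sum(1 for i in range(len(p)) for j in range(i + 1, len(p))
                     if p[i] > p[j])
    return -1 if inversions % 2 else 1

def det(a):
    """Permutation-sum (Leibniz) determinant: sum over all n! ways of
    picking one entry per row and column, each product weighted by the
    sign of the permutation of second indices."""
    n = len(a)
    return sum(sign(p) * prod(a[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

print(det([[1, 2], [3, 4]]))   # 1*4 - 2*3 = -2
print(det([[1, 2], [2, 4]]))   # 0: second row is twice the first
```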

Re: Definition of determinant and connection to linear dependence of columns

Thank you for the answer!

I've seen this definition; I just didn't really think of it as a definition but rather as a way of calculating the determinant. Just like in the case of n=3, the definition is (or so I thought) a geometrical one, while the way of calculating it is through, for example, Sarrus' rule...

But ok, I see what you mean. What you wrote is a definition (you probably meant that in the end you add all the products), and then one can show that if the determinant calculated this way equals zero, there must be a linear relationship between the rows/columns.

So the reason we use the same term for the geometrically motivated definition in the case of n<=3 and for the "technical" definition in the case of n>3 is that both have the same implications for systems of linear equations? I mean, since your definition doesn't seem to have anything in common with the pedagogical definition for n<=3, and in fact just seems ... technical, surely it must have been chosen because of the implications it can be shown to have, no?

Also, could you show me why it is that if the sums cancel out, there must be linear dependence among the columns?

Again, thanks a lot!

/Yair

Re: Definition of determinant and connection to linear dependence of columns

The properties of the determinant are somewhat painful to establish.

It is a **theorem** that the determinant of a matrix is equal to the (signed) volume of the parallelepiped spanned by the rows or columns of this matrix. It is by no means trivial. It is true also in higher dimensions. (You may take it as a definition of what a "volume" is.)

The proof is by showing that the determinant is fully characterized by some of its properties (it is the unique alternating, multilinear form on the columns which takes the value 1 on the identity matrix). Then you can show that the "signed volume form" has the same properties, and hence the two must be equal.
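A quick numeric sketch of both claims (matrix chosen arbitrarily by me), using the permutation-sum definition given earlier in the thread: swapping two rows flips the sign (alternating), the identity gives 1, and for n=3 the determinant equals the scalar triple product a . (b x c), i.e. the signed volume of the parallelepiped.

```python
from itertools import permutations
from math import prod

def det(a):
    """Permutation-sum determinant, as defined earlier in the thread."""
    n = len(a)
    def sign(p):
        inv = sum(1 for i in range(n) for j in range(i + 1, n) if p[i] > p[j])
        return -1 if inv % 2 else 1
    return sum(sign(p) * prod(a[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

m = [[1, 2, 5],
     [3, 4, 6],
     [7, 8, 9]]

# Alternating: swapping two rows flips the sign.
assert det([m[1], m[0], m[2]]) == -det(m)

# Normalization: det(I) = 1.
assert det([[1, 0, 0], [0, 1, 0], [0, 0, 1]]) == 1

# Signed volume for n=3: det equals the scalar triple product a . (b x c).
a, b, c = m
cross = [b[1]*c[2] - b[2]*c[1],
         b[2]*c[0] - b[0]*c[2],
         b[0]*c[1] - b[1]*c[0]]
triple = sum(a[i] * cross[i] for i in range(3))
print(det(m), triple)  # both -2
```

This is only a spot check, of course; the actual proof of uniqueness is the painful part mentioned above.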