
The importance of determinants in linear algebra.

  1. #1
    Junior Member
    Joined
    Apr 2008
    Posts
    35

    The importance of determinants in linear algebra.

    In some literature on linear algebra, determinants play a critical role and are emphasized in the early chapters (see the books by Anton & Rorres, and by Lay). However, in other literature they are totally ignored until the later chapters (see Gilbert Strang).
    How much importance should we give the topic of determinants? I tend to use them to test linear independence of vectors, and might extend this to finding the inverse, but I think Gauss-Jordan and LU might be easier for inverses. Do they have any other uses in linear algebra?
    Are there areas where determinants are used and have a real impact? Are there any real-life applications of determinants?
    Is there a really good motivating example or explanation which will hook students into this topic?

  2. #2
    Super Member
    Joined
    Jun 2012
    From
    AZ
    Posts
    616
    Thanks
    97

    Re: The importance of determinants in linear algebra.

    You can use determinants to find the area of a triangle or the volume of a parallelepiped, given just the coordinates of the vertices. They also show up in multivariable calculus: the determinant of the Hessian matrix is used in the second-derivative test to classify critical points.
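    To make the two geometric formulas concrete, here is a small numerical sketch (using Python with numpy; the tool and the particular points are just my own choices for illustration, not anything from this thread):

    Code:
    import numpy as np

    # Triangle with vertices p1, p2, p3:
    #   area = (1/2) |det([p2 - p1, p3 - p1])|
    p1, p2, p3 = np.array([0.0, 0.0]), np.array([4.0, 0.0]), np.array([0.0, 3.0])
    area = 0.5 * abs(np.linalg.det(np.column_stack((p2 - p1, p3 - p1))))
    print(area)      # 6.0

    # Parallelepiped spanned by vectors a, b, c:
    #   volume = |det([a, b, c])|
    a, b, c = np.array([1.0, 0, 0]), np.array([1.0, 2, 0]), np.array([1.0, 1, 3])
    volume = abs(np.linalg.det(np.column_stack((a, b, c))))
    print(volume)    # 6.0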

  3. #3
    MHF Contributor

    Joined
    Mar 2011
    From
    Tejas
    Posts
    3,401
    Thanks
    762

    Re: The importance of determinants in linear algebra.

    there are LOTS of "real-life" applications of determinants. geometrically, the determinant (well, actually its absolute value) measures what a linear transformation does to the n-dimensional content (= area, for n = 2, and volume, for n = 3) of the unit n-cube. if such a transformation is singular, it effectively "collapses one (or more) faces", reducing the dimensionality of the image, giving an n-dimensional content of 0 (even the largest area gives 0 volume, and even the longest length gives 0 area).
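    here is a quick numerical illustration of that content-scaling idea (just a sketch in Python/numpy, my own choice of tool, with made-up matrices):

    Code:
    import numpy as np

    A = np.array([[2.0, 1.0],
                  [0.0, 3.0]])     # sends the unit square to a parallelogram
    print(abs(np.linalg.det(A)))   # 6.0 -- the image of the unit square has area 6

    S = np.array([[1.0, 2.0],
                  [2.0, 4.0]])     # singular: the second row is twice the first
    print(abs(np.linalg.det(S)))   # 0.0 -- the unit square is flattened onto a line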

    for small values of n, Cramer's Rule gives a fairly efficient way of expressing a solution to a system of n equations in n unknowns (determinants become unwieldy to compute by hand for n larger than 5, and can be time-consuming even for 4x4 and 5x5 matrices).
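    for what it's worth, a small sketch of Cramer's Rule on a 2x2 system (Python/numpy, my own choice; the system is made up for illustration):

    Code:
    import numpy as np

    # the system  2x + y = 5,  x + 3y = 10
    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    b = np.array([5.0, 10.0])

    d = np.linalg.det(A)
    x = np.linalg.det(np.column_stack((b, A[:, 1]))) / d   # b replaces the first column
    y = np.linalg.det(np.column_stack((A[:, 0], b))) / d   # b replaces the second column
    print(x, y)                    # 1.0 3.0
    print(np.linalg.solve(A, b))   # [1. 3.]  -- agrees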

    but the real significance of the determinant is that it is a (monoid) homomorphism from End(n,F) (under composition; one can, "for most practical purposes", consider this as Mat(n,F), all nxn matrices with entries from F) to F. if we restrict our attention to invertible linear maps, we get a group homomorphism from GL(n,F) to F* (= F - {0}). furthermore, the determinant map is linear in each "row" or "column" of a matrix, which allows us to use what we know about linear maps to link determinants with row-operations, or column-operations.
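    to make the "linear in each row" remark concrete, here is a quick numerical check (a Python/numpy sketch, my choice of tool; the random matrix and scalars are arbitrary):

    Code:
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 3))
    r, s = rng.standard_normal(3), rng.standard_normal(3)
    c1, c2 = 2.0, -3.0

    # replace the first row of A by r, by s, and by c1*r + c2*s (rows 2 and 3 fixed)
    M1, M2, M3 = A.copy(), A.copy(), A.copy()
    M1[0], M2[0], M3[0] = r, s, c1 * r + c2 * s

    # linearity in the first row:
    print(np.allclose(np.linalg.det(M3),
                      c1 * np.linalg.det(M1) + c2 * np.linalg.det(M2)))   # True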

    multiplying in the field F is considerably easier than evaluating a determinant. knowing that det(AB) = det(A)det(B) lets us find det(AB) much faster than multiplying A with B, and then computing a determinant. if we have the good fortune that A is diagonalizable, for example, so that PAP^(-1) = D = diag(d_1,...,d_n), we know that:

    det(A) = det(D) = d_1 d_2 ... d_n.
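    a quick sanity check of both facts (a Python/numpy sketch, my own choice of tool; the matrices are made up for illustration):

    Code:
    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])
    B = np.array([[1.0, 2.0],
                  [0.0, 5.0]])

    # det is multiplicative: det(AB) = det(A)det(B)
    print(np.allclose(np.linalg.det(A @ B),
                      np.linalg.det(A) * np.linalg.det(B)))   # True

    # a matrix similar to D = diag(2, 5) has determinant 2*5 = 10
    D = np.diag([2.0, 5.0])
    P = np.array([[1.0, 1.0],
                  [1.0, 2.0]])              # any invertible P will do
    M = np.linalg.inv(P) @ D @ P            # so that P M P^(-1) = D
    print(np.linalg.det(M))                 # 10.0 (up to rounding)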

    furthermore, the determinant is involved in an essential way in finding eigenvalues (and thus eigenvectors), and in a basis composed of eigenvectors (an eigenbasis), the matrix for a linear transformation is "as simple as it can be" (in the best case, diagonal). taking measurements of vector-valued forces relative to an eigenbasis greatly simplifies "the coordinates we get" (we've chosen "the right axes of measurement").
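    concretely, the eigenvalues are the roots of the characteristic polynomial det(A - tI) = 0. a small sketch using the same A as in the sketch above (Python/numpy again, my choice):

    Code:
    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    # for a 2x2 matrix, det(A - tI) = t^2 - trace(A)*t + det(A)
    coeffs = [1.0, -np.trace(A), np.linalg.det(A)]   # t^2 - 7t + 10
    print(np.roots(coeffs))                          # [5. 2.]
    print(np.linalg.eig(A)[0])                       # same eigenvalues (possibly reordered)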

    and we can use things we know about polynomials, to learn things about matrices. this is what Cayley-Hamilton is all about. again, for low values of n, this often gives yet another way to express A^(-1), in terms of powers of A (if the null space of A is {0}, A does not have 0 as an eigenvalue, which means the minimal polynomial of A has a non-zero constant term. put the constant term times I on one side, and we can factor A out of what's left).
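    for a 2x2 matrix, Cayley-Hamilton says A^2 - trace(A)A + det(A)I = 0, so A^(-1) = (trace(A)I - A)/det(A) whenever det(A) is non-zero. a quick check (a Python/numpy sketch, my choice, same A as above):

    Code:
    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])
    t, d = np.trace(A), np.linalg.det(A)

    # Cayley-Hamilton for 2x2:  A^2 - t*A + d*I = 0
    print(np.allclose(A @ A - t * A + d * np.eye(2), 0))   # True

    # rearranged (valid since d != 0):  A^(-1) = (t*I - A) / d
    A_inv = (t * np.eye(2) - A) / d
    print(np.allclose(A_inv, np.linalg.inv(A)))            # True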

    from a teaching point of view, i think determinants are best delayed, until one passes from the "concrete" view of matrices, to the more "abstract" view of a linear transformation. to be willing to undergo the arduous computation determinants can involve, one should have some idea of how this actually *saves* computation, rather than creating it. in other words, concrete linear transformations, given by matrices with numerical entries, are the "leading examples", while linear transformations in the abstract are "organizational": a high-level and (con)dense(d) view of things that allows us to make certain deductions before "getting our hands dirty" with actual computation.

  4. #4
    Junior Member
    Joined
    Apr 2008
    Posts
    35

    Re: The importance of determinants in linear algebra.

    Thanks so much for your detailed answer.

  5. #5
    Junior Member
    Joined
    Apr 2008
    Posts
    35

    Re: The importance of determinants in linear algebra.

    Sorry, but in a linear algebra course, where should determinants be placed?
    Like I said in my first post - in some literature the topic is at the beginning, whilst in others it is bolted on at the end. I like the idea of checking whether vectors are independent by using determinants, so I think they should be placed before linear independence of vectors.
    What do you think? If you teach a linear algebra course, where do you place this topic?

  6. #6
    MHF Contributor

    Joined
    Mar 2011
    From
    Tejas
    Posts
    3,401
    Thanks
    762

    Re: The importance of determinants in linear algebra.

    somewhere in the middle. "where exactly" is going to be hard to say; it's going to depend somewhat on who is taking the class. in other words, something like this:

    1. Systems of Linear Equations, Matrices, Row-Reduction (Gaussian elimination)
    2. Vector Spaces, Bases, Subspaces, Dimension, Coordinates
    3. Linear Transformations, Matrix representations, Isomorphisms, Quotient Spaces, Rank-Nullity
    4. Polynomial Spaces, Algebras
    5. Determinants
    6. Eigenvalues, Eigenvectors, Canonical Forms, Spectral Decomposition
    7. Inner Product Spaces, Orthogonalization
    8. Other Topics (Bilinear forms, Hermitian forms, Dual Spaces, Annihilators, etc.)

    depending on whom the course is for, this might be re-ordered somewhat, to either emphasize computational aspects and/or applications, or conceptual ideas and analogies with other kinds of mathematical structures (groups, fields, rings, etc.). for example, one might include a brief section on how the techniques of linear equations can be used to solve linear differential equations, or a brief introduction to modules, or "how to turn any set into the free vector space over it".

    yes, you CAN use determinants to check if a set of vectors is linearly dependent. but for, say, a set of 6 vectors, just how practical is this? in such a case my first instinct would be to use the definition of linear independence directly. if it was unclear how i could do that, then my next move would be to form a matrix and determine its rank. the complexity of a determinant (by cofactor expansion) grows as n!, which is to say, rather fast.
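    to illustrate the comparison (a Python/numpy sketch, my own choice of tool, with made-up vectors): for a square collection of vectors you can look at a determinant, but the rank is computed by row-reduction, scales much better, and also handles non-square cases:

    Code:
    import numpy as np

    rng = np.random.default_rng(1)
    V = rng.standard_normal((6, 6))   # 6 vectors in R^6, stored as columns
    V[:, 5] = V[:, 0] + V[:, 1]       # force a linear dependence

    print(np.linalg.det(V))           # ~0 (up to rounding): the columns are dependent
    print(np.linalg.matrix_rank(V))   # 5, not 6: dependent

    # rank also handles a non-square set, e.g. 4 vectors in R^6,
    # where a determinant isn't even defined
    W = rng.standard_normal((6, 4))
    print(np.linalg.matrix_rank(W))   # 4: these columns are (almost surely) independent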

    i think it's important for people to realize that there are "other vector spaces" besides just R^n, that vectors are more than just "arrays" of numbers. if we can "add" (with the usual notions of an identity, associativity and commutativity, and additive inverses) and have a notion of "scaling" (by field elements, with the appropriate "distributive" rules), we have a vector space. and we don't need to know "what" the basis elements are (they might be hard to find, in fact). a little linearity goes a long way. when we know we have a vector space (especially if it is finite-dimensional), that tells us a LOT. we can "surf" a long way just on the rules vector spaces have to obey, without being overly concerned with the "messy details".

    in other words, to appreciate the beauty of: det(A) = 0 iff A is singular, one needs to know exactly what this MEANS. to say it is only a statement about "matrices" is to sell short how much determinants can DO. perhaps later, a student of linear algebra will take a course on differential forms, in which determinants play an important role (for a surface, the determinant of the jacobian tells us the "scaling factor" for the area element of a local coordinate chart). i think it proper to define the basic terms and scope of linear algebra FIRST (and illustrate that this captures "systems of linear equations" as a small part of that scope), before introducing determinants. this may be most students' first exposure to a thing defined by axioms. the power and importance of this should not be down-played.

