
Math Help - Show (1,a,a^2), (1,b,b^2), (1,c,c^2) are linearly independent

  1. #1
    Newbie
    Joined
    Apr 2011
    Posts
    2

    Show (1,a,a^2), (1,b,b^2), (1,c,c^2) are linearly independent

    Hi there,
    I'm attempting to solve the following question: Show {(1,a,a^2), (1,b,b^2), (1,c,c^2)} are linearly independent if a, b and c are distinct.

    I'm sure I'm so close!
    What I've come up with so far is putting the three vectors as the rows of a matrix and computing the determinant by cofactor expansion, which gives a(b^2-c^2)+b(c^2-a^2)+c(a^2-b^2)

    I know that they are linearly independent exactly when this determinant is nonzero, and it's simple to show it DOES equal zero if a=b, b=c or c=a. For example, when a=b the expression becomes b(b^2-c^2)+b(c^2-b^2)+c(b^2-b^2) = b^3-bc^2+bc^2-b^3+0 = 0

    However, showing this isn't sufficient: I still need to prove that when a, b and c are distinct, the determinant can never equal zero.

    Could you point me in the right direction?
    Thanks!
    Last edited by genericguy; April 16th 2011 at 08:32 PM. Reason: fixing latex

  2. #2
    MHF Contributor alexmahone's Avatar
    Joined
    Oct 2008
    Posts
    1,074
    Thanks
    7
    You can factorise the determinant as:
    det=(a-b)(b-c)(c-a)

    The result follows immediately.
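    In case anyone wants to verify the factorisation by machine, here is a quick symbolic check with sympy (a sketch, not part of the original working):

    ```python
    from sympy import symbols, Matrix, expand, factor

    a, b, c = symbols('a b c')

    # Matrix whose rows are (1, a, a^2), (1, b, b^2), (1, c, c^2)
    M = Matrix([[1, a, a**2],
                [1, b, b**2],
                [1, c, c**2]])
    det = M.det()

    # Confirm the claimed factorisation: det == (a - b)(b - c)(c - a)
    assert expand(det - (a - b)*(b - c)*(c - a)) == 0

    # The factored form is a product of pairwise differences,
    # so it vanishes iff two of a, b, c coincide.
    print(factor(det))
    ```

    Since each factor is a difference of two of the parameters, the determinant is zero precisely when a = b, b = c, or c = a; distinct a, b, c force it to be nonzero.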

  3. #3
    Newbie
    Joined
    Apr 2011
    Posts
    2
    Wow, I was trying so hard to factorise it but missed that completely. :|
    Thanks!

  4. #4
    MHF Contributor Drexel28's Avatar
    Joined
    Nov 2009
    From
    Berkeley, California
    Posts
    4,563
    Thanks
    21
    Quote Originally Posted by genericguy View Post
    Hi there,
    I'm attempting to solve the following question: Show {(1,a,a^2), (1,b,b^2), (1,c,c^2)} are linearly independent if a, b and c are distinct.

    I'm sure I'm so close!
    What I've come up with so far is putting the three vectors as the rows of a matrix and computing the determinant by cofactor expansion, which gives a(b^2-c^2)+b(c^2-a^2)+c(a^2-b^2)

    I know that they are linearly independent exactly when this determinant is nonzero, and it's simple to show it DOES equal zero if a=b, b=c or c=a. For example, when a=b the expression becomes b(b^2-c^2)+b(c^2-b^2)+c(b^2-b^2) = b^3-bc^2+bc^2-b^3+0 = 0

    However, showing this isn't sufficient: I still need to prove that when a, b and c are distinct, the determinant can never equal zero.

    Could you point me in the right direction?
    Thanks!
    Just as a point of interest: in general, the vectors (1, x_1, ..., x_1^n), ..., (1, x_n, ..., x_n^n) are linearly independent whenever x_j != x_k for all j != k in [n]. The matrix with these vectors as rows is the Vandermonde matrix.
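    The general statement is easy to test numerically with numpy's built-in Vandermonde constructor (sample nodes 1, 2, 3, 4 chosen purely for illustration):

    ```python
    import numpy as np

    # Distinct sample nodes; with increasing=True, row k of the
    # Vandermonde matrix is (1, x_k, x_k^2, x_k^3).
    x = np.array([1.0, 2.0, 3.0, 4.0])
    V = np.vander(x, increasing=True)

    # Full rank <=> the rows (1, x_k, ..., x_k^(n-1)) are linearly independent.
    print(np.linalg.matrix_rank(V))  # 4
    ```

    The determinant of this matrix is the product of all pairwise differences x_j - x_k (j > k), which is nonzero exactly when the nodes are distinct; this is the same mechanism as the 3x3 case above.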

