
Math Help - Linear Transformation, kerT question

  1. #1
    Member

    Linear Transformation, kerT question

    T: R^n \rightarrow R
    T(x_1,...,x_n) = x_1 + 2x_2 +...+ kx_k +...+ nx_n for every (x_1,...,x_n) \in R^n

    Find a basis for kerT.

    Attempt:

    First, I chose a basis for R^n: W = ((1,...,0),(0,1,...,0),...,(0,0,...,1))
    and T(W) = ((1),(2),(3),...,(k),...,(n))

    so basically, I don't know what I'm doing... is there such a thing as a basis for R?
    is T(W) really (1+2+3+...+k+...+n)??
    and how do I come up with the representation matrix?

  2. #2
    Senior Member
    Yes, there is a basis for \mathbb{R} and it would be the number 1. The problem I see is with your T(w). You didn't use the w that you defined as the basis for \mathbb{R}^n. It should be
    T(w) = (1,\dots,0) + 2(0,1,\dots,0) + \cdots + k(0,\dots,0,1,0,\dots,0) + \cdots + n(0,0,\dots,1) = (1,2,\dots,k,\dots,n).
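    A tiny check of this, applying T to each standard basis vector for one specific n; the numpy code and the choice n = 4 below are my own illustration, not part of the post:

[code]
import numpy as np

n = 4

# T(x_1, ..., x_n) = x_1 + 2*x_2 + ... + n*x_n
def T(x):
    return sum((k + 1) * x[k] for k in range(n))

E = np.eye(n)                       # rows are the standard basis vectors e_1, ..., e_n
print([T(E[k]) for k in range(n)])  # [1.0, 2.0, 3.0, 4.0], i.e. the images are 1, 2, ..., n
[/code]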

  3. #3
    Member
    Quote Originally Posted by lvleph View Post
    Yes, there is a basis for \mathbb{R} and it would be the number 1. The problem I see is with your T(w). You didn't use the w that you defined as the basis for \mathbb{R}^n. It should be
    T(w) = (1,\dots,0) + 2(0,1,\dots,0) + \cdots + k(0,\dots,0,1,0,\dots,0) + \cdots + n(0,0,\dots,1) = (1,2,\dots,k,\dots,n).
    With my luck, I actually happened to see this before you edited it. Why did you take away the matrix and all that extra stuff??

    So after I get (1,2,...,k,...,n), where do I continue? How do I turn this into a matrix?

  4. #4
    Senior Member
    I took it away because it may not be a matrix. I thought about it more and decided it was better not to post that. Unfortunately, I don't have my book near me to look this up; I will see what I can find so I can be sure how to answer you.

    My suggestion is to think about what happens when you multiply a matrix and a vector together: you get a linear combination of the rows of the matrix, at least if we are multiplying on the right; if we multiply on the left instead, we get linear combinations of the columns. This is why I decided not to post what I had: it wasn't completely clear which side we were acting on. Let me read some and get back to you.
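    A quick numerical sketch of the row-versus-column point above, showing which combination you get depending on which side the vector sits; the numpy code and the particular matrix here are illustrative choices, not from the thread:

[code]
import numpy as np

# A small matrix and two vectors, one acting on each side of it.
M = np.array([[1, 2, 3],
              [4, 5, 6]])
u = np.array([10, 100])    # multiplies M on the left:  u @ M
v = np.array([1, 0, -1])   # multiplies M on the right: M @ v

# u @ M is the combination 10*(row 1) + 100*(row 2) of the rows of M.
print(u @ M)                     # [410 520 630]
print(10 * M[0] + 100 * M[1])    # same vector

# M @ v is the combination 1*(col 1) + 0*(col 2) - 1*(col 3) of the columns of M.
print(M @ v)                     # [-2 -2]
print(M[:, 0] - M[:, 2])         # same vector
[/code]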

  5. #5
    Senior Member
    Okay, it looks like I had the right idea, but was wrong in my process.

    By your definition of the linear transformation, given a vector \alpha = (x_1,x_2,\dots,x_n) the linear transformation T is given by
    T\alpha = x_1 + 2x_2 + \cdots + kx_k + \cdots + nx_n.
    So we are transforming a vector in \mathbb{R}^n into a scalar in \mathbb{R}. What does this transform T look like? First, we must realize that \alpha is multiplying on the left. Thus, T must be n \times 1, since \alpha is 1 \times n and we need a scalar.

    Now remember in my previous post that when we multiply vectors and matrices we get linear combinations. In this case our matrix is actually a vector and the linear combination is T\alpha = x_1 + 2x_2 + \cdots + kx_k + \cdots + nx_n. Thus, T is given by
    \begin{bmatrix}1 \\ 2 \\ \vdots \\ k \\ \vdots \\ n\end{bmatrix}.

    Now you can see why I deleted my previous post: it was wrong.
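    A small sanity check of that representation for one concrete case; the numpy code and the choice n = 5 below are my own, not part of the post:

[code]
import numpy as np

n = 5
T = np.arange(1, n + 1)                       # the column (1, 2, ..., n) representing T
x = np.array([3.0, -1.0, 4.0, 1.0, -5.0])     # an arbitrary vector in R^n

# Multiplying x on the left of T gives the scalar x_1 + 2*x_2 + ... + n*x_n.
print(x @ T)                                  # -8.0
print(sum((k + 1) * x[k] for k in range(n)))  # the same sum written out: -8.0
[/code]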

  6. #6
    Member
    Quote Originally Posted by lvleph View Post
    Okay, it looks like I had the right idea, but was wrong in my process.

    By your definition of the linear transformation, given a vector \alpha = (x_1,x_2,\dots,x_n) the linear transformation T is given by
    T\alpha = x_1 + 2x_2 + \cdots + kx_k + \cdots + nx_n.
    So we are transforming a vector in \mathbb{R}^n to a scalar in \mathbb{R}. So what does this transform T look like? First, we must realize that we are multiplying on the left. Thus, T must be n \times 1 since \alpha is 1 \times n and we need a scalar.

    Now remember in my previous post that when we multiply vectors and matrices we get linear combinations. In this case our matrix is actually a vector and the linear combination is T\alpha = x_1 + 2x_2 + \cdots + kx_k + \cdots + nx_n. Thus, T is given by
    \begin{bmatrix}1 \\ 2 \\ \vdots \\ k \\ \vdots \\ n\end{bmatrix}.

    Now you can see why I delete my previous post, it was wrong.
    Thanks. That seems right...but we still have not gotten anywhere with the question. How do we find a basis for kerT???

  7. #7
    Senior Member
    Oh yeah, sorry. Now we want to find a basis for the space of vectors that are sent to 0 by T, i.e., we want T\alpha = x_1 + 2x_2 + \cdots + kx_k + \cdots + nx_n = 0. If n is even, the easiest approach I can think of is to take one vector in the kernel and then split it into a linear combination of vectors. One such vector would be
    \left(-n, \frac{1-n}{2}, \frac{2-n}{3}, \cdots, \frac{-k}{k-1}, 1, \cdots, 1\right)
    and so one such basis is
    \left\{\left(-n,0,0,\cdots,0\right),\left(0,\frac{1-n}{2},0,\cdots,0\right),\left(0,0,\frac{2-n}{3},\cdots,0\right),\cdots, \left(0,\cdots,\frac{-k}{k-1},0,\cdots,0\right), \left(0,\cdots,0,1,0,\cdots,0\right), \cdots, \left(0,\cdots,0,1\right)\right\}
    If n is odd we could just require one of the indices to be zero and then do everything else in almost the same way.

  8. #8
    Senior Member
    I had a dream about this problem and realized that the basis for the kernel that I gave you was wrong.

    If a vector is to be in the kernel, the transformation T must send it to 0. However, the individual vectors I gave you are not sent to zero.
    The following vectors are sent to zero
    \left\{\left(-n,0,0,\cdots,0,1\right),\left(0,\frac{1-n}{2},0,\cdots,0,1,0\right),\left(0,0,\frac{2-n}{3},\cdots,0,1,0,0\right),\cdots, \left(0,\cdots,\frac{-k}{k-1},1,0,\cdots,0\right)\right\}\quad n \in 2\mathbb{Z}^+.
    I am unsure if this is the complete basis though. It seems like I might be missing something. Anyway, this is the idea behind it.
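    For what it's worth, here is a numerical check that vectors of that shape are indeed sent to 0, reading the pattern as: the j-th vector has (j - 1 - n)/j in position j and a 1 in position n + 1 - j. That reading of the pattern, the numpy code, and the choice n = 6 are my own; and, as noted above, these n/2 vectors by themselves do not account for the whole kernel, which has dimension n - 1.

[code]
import numpy as np

n = 6                          # n even, as in the post (the value 6 is my choice)
T = np.arange(1, n + 1)        # T written as the vector (1, 2, ..., n)

# j-th vector: (j - 1 - n)/j in slot j, a 1 in slot n + 1 - j, zeros elsewhere.
for j in range(1, n // 2 + 1):
    v = np.zeros(n)
    v[j - 1] = (j - 1 - n) / j
    v[n - j] = 1.0
    print(v, " T(v) =", v @ T)   # prints 0 for each vector
[/code]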

  9. #9
    MHF Contributor

    A variation on lvleph's solution (because I don't like fractions!):
    T(\alpha) = x_1 + 2x_2 + 3x_3 + \cdots + nx_n = 0, so x_1 = -2x_2 - 3x_3 - \cdots - nx_n.

    That means we can write (x_1, x_2, x_3, \cdots, x_n) as (-2x_2 - 3x_3 - \cdots - nx_n, x_2, x_3, \cdots, x_n) = (-2x_2, x_2, 0, \cdots, 0) + (-3x_3, 0, x_3, \cdots, 0) + \cdots + (-nx_n, 0, 0, \cdots, x_n) = x_2(-2, 1, 0, \cdots, 0) + x_3(-3, 0, 1, \cdots, 0) + \cdots + x_n(-n, 0, 0, \cdots, 1).

    That should make the basis obvious.

    The dimension of R^n is, of course, n, and we have put one condition on the kernel: x_1 + 2x_2 + 3x_3 + \cdots + nx_n = 0, so the kernel has dimension n - 1.

    We also know that has to be true because of the "rank-nullity theorem": T maps R^n onto R and so has rank 1. The nullity must be n - 1 so that 1 + (n - 1) = n.
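    A quick check of that basis and of the rank-nullity count for one concrete case; the numpy code and the choice n = 5 are my own sketch, not part of the post:

[code]
import numpy as np

n = 5
T = np.arange(1, n + 1)        # T(x) = x_1 + 2x_2 + ... + nx_n

# The basis read off above: v_k = (-k, 0, ..., 1, ..., 0), with the 1 in slot k,
# for k = 2, ..., n.
basis = []
for k in range(2, n + 1):
    v = np.zeros(n)
    v[0] = -k
    v[k - 1] = 1.0
    assert v @ T == 0          # each v_k is in the kernel of T
    basis.append(v)

# The n - 1 vectors are linearly independent, so dim ker T = n - 1,
# matching rank-nullity: rank 1 + nullity (n - 1) = n.
print(np.linalg.matrix_rank(np.array(basis)))   # prints 4, i.e. n - 1
[/code]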

  10. #10
    Senior Member
    That is a much better solution, I think.

  11. #11
    Member
    Quote Originally Posted by lvleph View Post
    I had a dream about this problem and realize the basis for the kernel that I gave you was wrong.
    I love how you had a dream about this problem. That is so funny...and a bit nerdy
    You guys are awesome. Thanks for your help. I'll review this more when I have time.

  12. #12
    Senior Member
    I think better when I am asleep. My best research is done sleeping.