# Thread: Linear Transformation, kerT question

1. ## Linear Transformation, kerT question

$T: R^n \rightarrow R$
$T(x_1,...,x_n) = x_1 + 2x_2 +...+ kx_k +...+ nx_n$ for every $(x_1,...,x_n) \in R^n$

Find a basis for kerT.

Attempt:

First, I chose a basis for $R^n$: W = ((1,0,...,0),(0,1,...,0),...,(0,0,...,1))
and T(W) = ((1),(2),(3),...,(k),...,(n))

so basically, I don't know what I'm doing... Is there such a thing as a basis for R?
Is T(W) really (1+2+3+...+k+...+n)?
And how do I come up with the representation matrix?

2. Yes, there is a basis for $\mathbb{R}$ and it would be the number 1. The problem I see is with your $T(w)$. You didn't use the $w$ that you defined as the basis for $\mathbb{R}^n$. It should be
$T(w) = (1,\dots,0) + 2(0,1,\dots,0) + \cdots + k(0,\dots,0,1,0,\dots,0) + \cdots + n(0,0,\dots,1) = (1,2,\dots,k,\dots,n)$.

3. Originally Posted by lvleph
Yes, there is a basis for $\mathbb{R}$ and it would be the number 1. The problem I see is with your $T(w)$. You didn't use the $w$ that you defined as the basis for $\mathbb{R}^n$. It should be
$T(w) = (1,\dots,0) + 2(0,1,\dots,0) + \cdots + k(0,\dots,0,1,0,\dots,0) + \cdots + n(0,0,\dots,1) = (1,2,\dots,k,\dots,n)$.
With my luck, I actually happened to see this before you edited it. Why did you take away the matrix and all that extra stuff??

so after I get (1,2,...,k,...,n) where do I continue? How do I turn this into a matrix??

4. I took it away because it may not be a matrix. I thought about it more and decided that it was better not to post that. Unfortunately, I don't have my book near me so that I can look this up. I will see what I can find so I can be sure how to answer you.

My suggestion is to think about what happens when you multiply a matrix and a vector together: you get a linear combination of the rows of the matrix, at least if we are multiplying on the right. We could multiply on the left instead, and then we get linear combinations of the columns. This is why I decided not to post what I had; it wasn't completely clear which side we were acting on. Let me read some and get back to you.

5. Okay, it looks like I had the right idea, but was wrong in my process.

By your definition of the linear transformation, given a vector $\alpha = (x_1,x_2,\dots,x_n)$ the linear transformation $T$ is given by
$T\alpha = x_1 + 2x_2 + \cdots + kx_k + \cdots + nx_n$.
So we are transforming a vector in $\mathbb{R}^n$ to a scalar in $\mathbb{R}$. So what does this transform $T$ look like? First, we must realize that we are multiplying on the left. Thus, $T$ must be $n \times 1$ since $\alpha$ is $1 \times n$ and we need a scalar.

Now remember in my previous post that when we multiply vectors and matrices we get linear combinations. In this case our matrix is actually a vector and the linear combination is $T\alpha = x_1 + 2x_2 + \cdots + kx_k + \cdots + nx_n$. Thus, $T$ is given by
$\begin{bmatrix}1 \\ 2 \\ \vdots \\ k \\ \vdots \\ n\end{bmatrix}$.

Now you can see why I deleted my previous post; it was wrong.
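A quick numerical check of this (mine, not from the thread): if $T$ is represented by the column vector $(1, 2, \dots, n)$, then applying $T$ to $\alpha$ is just the dot product of $\alpha$ with that vector. The choice $n = 5$ and the sample vector are arbitrary.

```python
# Representing T by the column vector (1, 2, ..., n) means
# T(alpha) is the dot product alpha . (1, 2, ..., n).
n = 5
T_column = list(range(1, n + 1))   # the "matrix" of T: [1, 2, ..., n]

def apply_T(alpha):
    """T(x1,...,xn) = x1 + 2*x2 + ... + n*xn, as a dot product."""
    return sum(k * x for k, x in zip(T_column, alpha))

alpha = [1, 0, 2, 0, 1]            # an arbitrary vector in R^5
print(apply_T(alpha))              # 1*1 + 2*0 + 3*2 + 4*0 + 5*1 = 12
```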

6. Originally Posted by lvleph
Okay, it looks like I had the right idea, but was wrong in my process.

By your definition of the linear transformation, given a vector $\alpha = (x_1,x_2,\dots,x_n)$ the linear transformation $T$ is given by
$T\alpha = x_1 + 2x_2 + \cdots + kx_k + \cdots + nx_n$.
So we are transforming a vector in $\mathbb{R}^n$ to a scalar in $\mathbb{R}$. So what does this transform $T$ look like? First, we must realize that we are multiplying on the left. Thus, $T$ must be $n \times 1$ since $\alpha$ is $1 \times n$ and we need a scalar.

Now remember in my previous post that when we multiply vectors and matrices we get linear combinations. In this case our matrix is actually a vector and the linear combination is $T\alpha = x_1 + 2x_2 + \cdots + kx_k + \cdots + nx_n$. Thus, $T$ is given by
$\begin{bmatrix}1 \\ 2 \\ \vdots \\ k \\ \vdots \\ n\end{bmatrix}$.

Now you can see why I deleted my previous post; it was wrong.
Thanks. That seems right...but we still have not gotten anywhere with the question. How do we find a basis for kerT???

7. Oh yeah, sorry. Now we want to find a basis for the space of vectors that are sent to $0$ by $T$, i.e., we want $T\alpha = x_1 + 2x_2 + \cdots + kx_k + \cdots + nx_n = 0$. If $n$ is even, the easiest way I can think of is to find one such vector and then split it into a linear combination of vectors. One such vector would be
$\left(-n, \frac{1-n}{2}, \frac{2-n}{3}, \cdots, \frac{-k}{k-1}, 1, \cdots, 1\right)$
and so one such basis is
$\left\{\left(-n,0,0,\cdots,0\right),\left(0,\frac{1-n}{2},0,\cdots,0\right),\left(0,0,\frac{2-n}{3},\cdots,0\right),\cdots, \left(0,\cdots,\frac{-k}{k-1}, 0,\cdots,0\right), \left(0,\cdots,0,1,0,\cdots,0\right), \cdots, \left(0,\cdots,0,1\right)\right\}$
If $n$ is odd we could just require one of the indices to be zero and then do everything else in almost the same way.

If a vector is to be in the kernel, the transformation T must send it to 0. However, the individual vectors I gave you are not sent to zero.
The following vectors are sent to zero
$\left\{\left(-n,0,0\cdots,1\right),\left(0,\frac{1-n}{2},0,\cdots,0,1,0\right),\left(0,0,\frac{2-n}{3},\cdots,0,1,0,0\right),\cdots, \left(0,\cdots,\frac{-k}{k-1},1,0 ,\cdots,0\right)\right\}\quad n \in 2\mathbb{Z}^+$.
I am unsure if this is the complete basis though. It seems like I might be missing something. Anyway, this is the idea behind it.
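A small check of the corrected vectors above (my own sketch, not from the thread), for an arbitrary even $n$: the $j$-th vector has $\frac{j-1-n}{j}$ in slot $j$ and $1$ in slot $n+1-j$. Each one is indeed sent to $0$ by $T$, but there are only $n/2$ of them, while $\ker T$ has dimension $n-1$, so this list is in fact incomplete.

```python
from fractions import Fraction

n = 6  # arbitrary even choice

def T(v):
    # T(x1,...,xn) = 1*x1 + 2*x2 + ... + n*xn
    return sum((k + 1) * x for k, x in enumerate(v))

vectors = []
for j in range(1, n // 2 + 1):
    v = [Fraction(0)] * n
    v[j - 1] = Fraction(j - 1 - n, j)   # (j-1-n)/j in slot j
    v[n - j] = Fraction(1)              # 1 in slot n+1-j
    vectors.append(v)

print(all(T(v) == 0 for v in vectors))  # all lie in the kernel...
print(len(vectors), "vectors, but dim ker T =", n - 1)
```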

9. A variation on lvleph's solution (because I don't like fractions!):
$T(\alpha)= x_1+ 2x_2+ 3x_3+ \cdots+ nx_n= 0$ so $x_1= -2x_2- 3x_3- \cdots- nx_n$.

That means we can write $(x_1, x_2, x_3, \cdots, x_n)$ as $(-2x_2- 3x_3-\cdots- nx_n, x_2, x_3, \cdots, x_n)$ $= (-2x_2, x_2, 0, \cdots, 0)+ (-3x_3, 0, x_3, \cdots, 0)+\cdots+ (-nx_n, 0, 0, \cdots, x_n)$ $= x_2(-2, 1, 0,\cdots, 0)+ x_3(-3, 0, 1, \cdots, 0)+\cdots+ x_n(-n, 0, 0, \cdots, 1)$

That should make the basis obvious.

The dimension of $R^n$ is, of course, $n$, and we have put one condition on the kernel: $x_1+ 2x_2+ 3x_3+ \cdots+ nx_n= 0$, so the kernel has dimension $n-1$.

We also know this has to be true because of the "rank-nullity theorem". $T$ maps all of $R^n$ onto $R$ and so has rank 1. The nullity must be $n-1$ so that $1+ (n-1)= n$.
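The basis read off above can be checked with a short sketch (mine, not from the thread): for $k = 2, \dots, n$ take the vector with $-k$ in the first slot and $1$ in slot $k$. Each lies in $\ker T$, and there are $n-1$ of them, matching the rank-nullity count. Again $n = 5$ is arbitrary.

```python
n = 5

def T(v):
    # T(x1,...,xn) = 1*x1 + 2*x2 + ... + n*xn
    return sum((i + 1) * x for i, x in enumerate(v))

basis = []
for k in range(2, n + 1):
    v = [0] * n
    v[0] = -k          # x1 = -k*x_k with x_k = 1, all other slots 0
    v[k - 1] = 1
    basis.append(v)

print(all(T(v) == 0 for v in basis))   # every basis vector is in ker T
print(len(basis))                      # n - 1 = 4
```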

10. That is a much better solution, I think.