# Linear mappings and vector spaces proof

• Feb 16th 2011, 11:33 AM
hmmmm
Linear mappings and vector spaces proof
let $T:U\rightarrow{V}$ be a linear mapping between vector spaces U and V, and let X be a subspace of U. Show that

$T(X)=\{v\in V \mid v=Tx\ \mbox{for some}\ x\in X\}$ is a subspace of V

I can understand why this is; it seems pretty trivial, but I am not sure how you would go about proving it.

Thanks for any help
• Feb 16th 2011, 11:38 AM
HallsofIvy
I presume you mean $T(X)= \{v \mid v= Tx\ \mbox{for some}\ x\in X\}$. You don't "prove it"- that is the definition of T(X)
• Feb 16th 2011, 11:40 AM
Ackbeet
I think you meant

$T(X)=\{v\in V \mid v=Tx\ \mbox{for some}\ x\in X\},$ right? (Note the capital X on the LHS.)

I think I'd need a bit more background in order to understand your problem, because this is probably how I would define the set on the LHS. How does your book or professor define the LHS normally?
• Feb 16th 2011, 11:57 AM
hmmmm
Oh sorry, I meant that I am asked to prove that $T(X)$ is a subspace of V. Sorry about that.
• Feb 16th 2011, 12:00 PM
Ackbeet
So, you could just show that it's closed under scalar multiplication and vector addition, and that the candidate subspace T(X) contains the zero vector. Then you're done, right? So how does this look for you?
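
In symbols, the conditions to check are the standard subspace criteria applied to $T(X)$ (a sketch of what must be verified, not the proof itself):

```latex
\begin{align*}
&\text{(i)}\quad 0 \in T(X),\\
&\text{(ii)}\quad u, v \in T(X) \implies u + v \in T(X),\\
&\text{(iii)}\quad u \in T(X) \text{ and } a \text{ a scalar} \implies au \in T(X).
\end{align*}
```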
• Feb 16th 2011, 12:06 PM
hmmmm
Yeah, but as T is defined as a linear mapping, these are satisfied? And as X is a subspace it contains 0, which maps to 0, so I am done? Or am I missing something?

Thanks for the help
• Feb 16th 2011, 12:16 PM
Ackbeet
Quote:

Originally Posted by hmmmm
Yeah, but as T is defined as a linear mapping, these are satisfied? And as X is a subspace it contains 0, which maps to 0, so I am done? Or am I missing something?

Thanks for the help

Well, I think you should write out the equations that show this. I agree that the linearity of T gets the job done, but that's precisely what you're asked to show. So, how would you write it out?
• Feb 17th 2011, 05:08 AM
HallsofIvy
Suppose u and v are in T(X). That is, there exists x in X such that u= T(x) and there exists y in X such that v= T(y). Now u+ v= T(x)+ T(y)= T(x+ y).

Suppose u is in T(X) and a is any scalar. Then there exists x in X such that u=T(x). Now au= aT(x)= T(ax).

Do you see how those prove that u+ v and au are in T(X)?
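
The argument above can be written out in full; the one step left implicit is that $x+y$ and $ax$ lie in $X$ because $X$ is a subspace of $U$:

```latex
\begin{align*}
u, v \in T(X) &\implies u = T(x),\ v = T(y) \ \text{for some } x, y \in X\\
&\implies u + v = T(x) + T(y) = T(x+y) \in T(X), \ \text{since } x+y \in X;\\
u \in T(X),\ a \text{ a scalar} &\implies au = aT(x) = T(ax) \in T(X), \ \text{since } ax \in X;\\
0 \in X &\implies 0 = T(0) \in T(X).
\end{align*}
```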