Linear Algebra Proof Regarding Linear Transformations

Given the following vectors in R^2:

a_1 = (1, -1), a_2 = (2, -1), a_3 = (-3, 2),

b_1 = (1, 0), b_2 = (0, 1), b_3 = (1, 1),

determine whether there is a linear transformation T from R^2 into R^2 such that T(a_i) = b_i for i = 1, 2, and 3.

Basically, I don't quite understand the process of working with linear transformations, so I'd appreciate at least a few pointers on how to approach this. Thank you.

Re: Linear Algebra Proof Regarding Linear Transformations

Suppose we have a linear transformation L taking each a_i to b_i. Let A be the matrix whose columns are a_1, a_2, a_3, and let B be the matrix whose columns are b_1, b_2, b_3.

For the matrix A, you can write the 3rd column as a linear combination of the first and second columns: a_3 = -a_1 - a_2.
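To find those coefficients, solve c_1 a_1 + c_2 a_2 = a_3. A sketch of the arithmetic:

```latex
\begin{aligned}
c_1 \begin{pmatrix} 1 \\ -1 \end{pmatrix}
+ c_2 \begin{pmatrix} 2 \\ -1 \end{pmatrix}
&= \begin{pmatrix} -3 \\ 2 \end{pmatrix}
\quad\Longrightarrow\quad
\begin{cases} c_1 + 2c_2 = -3 \\ -c_1 - c_2 = 2 \end{cases} \\[4pt]
\text{Adding the two equations gives } c_2 &= -1,
\text{ and then } c_1 = -3 - 2(-1) = -1, \\
\text{so } a_3 &= -a_1 - a_2.
\end{aligned}
```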

Since a linear transformation has the property L(c_1 v + c_2 w) = c_1 L(v) + c_2 L(w), it follows that L(a_3) = L(-a_1 - a_2) = -L(a_1) - L(a_2).

So if the first column of A went to the first column of B, and the second column of A went to the second column of B, then L(a_3) = -b_1 - b_2 = (-1, -1).

Since (-1, -1) is not equal to the 3rd column of B, namely b_3 = (1, 1), there is no linear transformation which takes the first column of A to the first column of B, the second to the second, and the third to the third.
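If you want to sanity-check the arithmetic numerically, here is a quick NumPy sketch (the variable names are mine, not part of the problem):

```python
import numpy as np

# The given vectors: columns of A are a1, a2, a3; columns of B are b1, b2, b3.
a1, a2, a3 = np.array([1, -1]), np.array([2, -1]), np.array([-3, 2])
b1, b2, b3 = np.array([1, 0]), np.array([0, 1]), np.array([1, 1])

# Solve c1*a1 + c2*a2 = a3 for the coefficients of the linear combination.
c1, c2 = np.linalg.solve(np.column_stack([a1, a2]), a3)
print(c1, c2)    # -1.0 -1.0, i.e. a3 = -a1 - a2

# Linearity forces L(a3) = c1*L(a1) + c2*L(a2) = c1*b1 + c2*b2.
forced = c1 * b1 + c2 * b2
print(forced)    # [-1. -1.], but b3 = (1, 1), so no such L exists
```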