Matrix of a Linear Transformation

Let L : P1 -> P2 be defined by L(p(t)) = t p(t) + p(0). Consider the ordered bases S = {t, 1} and S' = {t+1, t-1} for P1, and T = {t^2, t, 1} and T' = {t^2 + 1, t - 1, t + 1} for P2. Find the matrix representation of L with respect to

a) S and T

b) S' and T'

c) Find L(-3t - 3) using the definition of L and the matrices obtained in parts a and b.

So I've got the first two parts down and the first half of the third part.

For a, I figured out that L(t) = t^2 and L(1) = t + 1.

The coordinate vectors I get for those (in column form) are [1, 0, 0] and [0, 1, 1].
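As a quick sanity check I assembled the matrix in numpy (the coefficient ordering [t^2, t, 1] is just my convention, matching the basis T):

```python
import numpy as np

# Represent polynomials in P2 by coefficient vectors ordered [t^2, t, 1].
# Since T = {t^2, t, 1} is the standard basis, coordinates can be read off.
Lt = np.array([1, 0, 0])   # L(t) = t^2
L1 = np.array([0, 1, 1])   # L(1) = t + 1

# Columns of the matrix of L with respect to S and T
A = np.column_stack([Lt, L1])
print(A)
# [[1 0]
#  [0 1]
#  [0 1]]
```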

For b, I got that L(t+1) = t^2 + t + 1 and L(t-1) = t^2 - t - 1.

The coordinate vectors I get for those are [1, 1/2, 1/2] and [1, 1/2, -3/2].
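I double-checked these by solving the linear systems numerically (a minimal numpy sketch; again, coefficients are ordered [t^2, t, 1]):

```python
import numpy as np

# Columns = standard coefficient vectors of the T' basis {t^2 + 1, t - 1, t + 1}
Tp = np.array([[1.0, 0.0, 0.0],
               [0.0, 1.0, 1.0],
               [1.0, -1.0, 1.0]])

# L(t+1) = t^2 + t + 1 and L(t-1) = t^2 - t - 1 in standard coefficients
Lt1 = np.array([1.0, 1.0, 1.0])
Lt2 = np.array([1.0, -1.0, -1.0])

col1 = np.linalg.solve(Tp, Lt1)   # coordinates of L(t+1) in T'
col2 = np.linalg.solve(Tp, Lt2)   # coordinates of L(t-1) in T'
print(col1)  # [1.  0.5  0.5]
print(col2)  # [1.  0.5 -1.5]
```

Those agree with the fractions I got by hand.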

For c, I figured out the first part: L(-3t - 3) = -3t^2 - 3t - 3.

If I use the matrix from a and multiply it by [-3, -3] (the coordinates of -3t - 3 in S), I get that answer.
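Here's that computation spelled out in numpy so you can see exactly what I did:

```python
import numpy as np

# Matrix of L with respect to S = {t, 1} and T = {t^2, t, 1}, from part a
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 1.0]])

# -3t - 3 in basis S = {t, 1} has coordinates [-3, -3]
coords_S = np.array([-3.0, -3.0])
result_T = A @ coords_S
print(result_T)  # [-3. -3. -3.]  ->  -3t^2 - 3t - 3 in T, as expected
```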

However, when I try to solve this using the matrix from b, I get a totally different answer. Is there something I'm missing? I've been staring at this problem for 30 minutes and I can't figure out what I did wrong. Thanks, guys!
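To show exactly what I tried with the matrix from b (assuming, maybe wrongly, that I can reuse the same coordinate vector [-3, -3]):

```python
import numpy as np

# Matrix from part b: columns = coordinates of L(t+1), L(t-1) in T'
B = np.array([[1.0, 1.0],
              [0.5, 0.5],
              [0.5, -1.5]])

coords = np.array([-3.0, -3.0])   # the same vector I used with the matrix from a
result_Tp = B @ coords
print(result_Tp)  # [-6. -3.  3.]

# Convert back from T' = {t^2 + 1, t - 1, t + 1} to standard coefficients [t^2, t, 1]
Tp = np.array([[1.0, 0.0, 0.0],
               [0.0, 1.0, 1.0],
               [1.0, -1.0, 1.0]])
print(Tp @ result_Tp)  # [-6.  0.  0.]  ->  -6t^2, not -3t^2 - 3t - 3
```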