A' = P^{-1}AP and definition of the matrix of a linear transformation

First I'll give an excerpt from Larson's Linear Algebra book and then I'll ask my question. It is something like this:

1. Matrix for $\displaystyle T$ relative to basis $\displaystyle B$ is $\displaystyle A$

2. Matrix for $\displaystyle T$ relative to basis $\displaystyle B'$ is $\displaystyle A'$

3. Transition matrix from $\displaystyle B'$ to $\displaystyle B$ is $\displaystyle P$

4. Transition matrix from $\displaystyle B$ to $\displaystyle B'$ is $\displaystyle P^{-1}$

...there are two ways to get from the coordinate matrix $\displaystyle [v]_{B'}$ to the coordinate matrix $\displaystyle [T(v)]_{B'}$. One way is direct, using the matrix $\displaystyle A'$ to obtain

$\displaystyle A'[v]_{B'} = [T(v)]_{B'}$

The other way is indirect, using the matrices $\displaystyle P$, $\displaystyle A$, and $\displaystyle P^{-1}$ to obtain

$\displaystyle P^{-1}AP[v]_{B'} = [T(v)]_{B'}$

But by the definition of the matrix of a linear transformation relative to a basis, this implies that: <---My question about this sentence is given below

$\displaystyle A' = P^{-1}AP$

Couldn't it be the case where $\displaystyle P^{-1}AP[v]_{B'} = [T(v)]_{B'}$ but $\displaystyle P^{-1}AP \neq A'$?

Could you kindly tell me how $\displaystyle A' = P^{-1}AP$ follows from the definition of the matrix of a linear transformation relative to a basis?

Does it mean that if two linear transformations are equal, their transformation matrices are also equal?

Re: A' = P^{-1}AP and definition of the matrix of a linear transformation

the context in which this arises is important.

often, we are given a matrix (and coordinate vectors) in the standard basis, and for some reason, we wish to use a different basis instead. for example, perhaps we have a basis of eigenvectors (an eigenbasis), in which the matrix for T (which i will call A or A', depending on whether the basis B or B' is being used) is diagonal (or at least upper triangular), and diagonal matrices are MUCH easier to work with.

so we have T, and the matrix A for T in our standard basis B = {e_{1},e_{2},...,e_{n}} (in other words, [T]_{B} = A).

and then we get a new basis B' = {b_{1},b_{2},...,b_{n}}. what is the new matrix [T]_{B'}?

well, first we need to "change bases", and that is where P comes in. since A only gives "the right numbers" for B-coordinate vectors (our usual x = (x_{1},x_{2},...,x_{n})), we need to turn our B'-coordinate vectors [y_{1},y_{2},...,y_{n}]_{B'} into B-coordinate vectors. fortunately, P itself is usually easy to write down: we just list the "new" basis vectors (in the "old (standard)" basis) as the columns of P (being sure to get the ORDER correct!). so P[x]_{B'} = x (= [x]_{B}), since in the standard basis, a vector "is itself".

now we have x in the form A knows and loves, and A does whatever it usually does (namely, transforms x linearly). so now we have x "after A", or Ax, that is to say: [Ax]_{B}. but we need this "back in B'-coordinates", so now we have to "undo" the transformation P, which is exactly what P^{-1} does.

so if the matrix for T is A in some basis B, then the matrix for T in some other basis B' will ALWAYS be P^{-1}AP, for some invertible matrix P. which invertible matrix? the one that changes B' coordinates to B coordinates (the "change of basis" matrix B'-->B).
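as a quick sanity check, here is a numpy snippet of my own (not from the book, and the particular A and P are made up for illustration) showing that the "direct" and "indirect" routes really do agree:

```python
import numpy as np

# hypothetical example data: A is the matrix of T in the basis B,
# and the columns of P are the B'-basis vectors written in B-coordinates
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])

A_prime = np.linalg.inv(P) @ A @ P   # the matrix of T relative to B'

v_Bprime = np.array([2.0, 5.0])      # some B'-coordinate vector [v]_{B'}

direct = A_prime @ v_Bprime                         # A'[v]_{B'}
indirect = np.linalg.inv(P) @ (A @ (P @ v_Bprime))  # P^{-1}(A(P[v]_{B'}))

print(np.allclose(direct, indirect))  # True
```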

as a practical matter, one usually doesn't calculate P^{-1} by inverting P (although that IS one way to do it). instead, one finds the form of the standard basis in the "new" basis, and these will form the columns of P^{-1}.
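to illustrate that last point (again my own snippet, with a made-up P): column j of P^{-1} is exactly e_{j} written in the new basis, since it solves Pc = e_{j}:

```python
import numpy as np

# a made-up change-of-basis matrix P (columns = new basis vectors)
P = np.array([[1.0, 1.0],
              [1.0, -1.0]])

P_inv = np.linalg.inv(P)

# column j of P_inv solves P c = e_j, i.e. it expresses e_j in the new basis
e1 = np.array([1.0, 0.0])
c = np.linalg.solve(P, e1)
print(np.allclose(c, P_inv[:, 0]))  # True
```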

here is a simple example: consider the linear transformation T given by: T(x,y) = (x+3y,3x+y). this has the matrix (in the standard basis) A =

[1 3]
[3 1].

note that A(1,1) = (4,4) = 4(1,1).

so for the subspace of R^{2}, {(a,a): a in R}, T just "stretches a vector by 4". note also that A(1,-1) = (-2,2) = (-2)(1,-1), so for the subspace of R^{2} {(b,-b):b in R}, T just "stretches by 2, and flips (direction)". this suggests that instead of our standard basis B = {(1,0),(0,1)}, we use the new basis B' = {(1,1),(1,-1)}, on which the action of T is "easy to describe".
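those two eigenvector computations can be double-checked numerically (a snippet of my own, not part of the original example):

```python
import numpy as np

A = np.array([[1, 3],
              [3, 1]])

# A(1,1) = (4,4) = 4(1,1): eigenvector with eigenvalue 4
print(A @ np.array([1, 1]))

# A(1,-1) = (-2,2) = (-2)(1,-1): eigenvector with eigenvalue -2
print(A @ np.array([1, -1]))
```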

so first we calculate P, which is easy: P =

[1 1 ]
[1 -1].

next we find P^{-1}. if (1,0) = c(1,1) + d(1,-1) = (c+d,c-d), then evidently c-d = 0, so c = d. then 2c = 1 gives us c = 1/2. so (1,0) = [1/2,1/2]_{B'}.

and if (0,1) = h(1,1) + k(1,-1) = (h+k,h-k), then h = -k, so 2h = 1, so h = 1/2, so (0,1) = [1/2,-1/2]_{B'}. thus P^{-1} =

[1/2 1/2 ]
[1/2 -1/2].

so the "new" matrix for T (in this new basis) is:

P^{-1}AP =

[1/2 1/2 ][1 3][1 1 ]
[1/2 -1/2][3 1][1 -1] =

[4 0 ]
[0 -2],

which reflects the fact that T "stretches the first basis vector by 4, and stretches the 2nd basis vector by 2 and changes sign".
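here is the whole example verified in numpy (just my own sanity check of the arithmetic above):

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [3.0, 1.0]])
P = np.array([[1.0, 1.0],
              [1.0, -1.0]])
P_inv = np.array([[0.5, 0.5],
                  [0.5, -0.5]])

# P_inv really is the inverse of P
print(np.allclose(P_inv, np.linalg.inv(P)))   # True

# and the "new" matrix P^{-1}AP is diagonal, with entries 4 and -2
print(P_inv @ A @ P)
```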

obviously, this kind of "change of basis" is more advantageous if we have a bigger matrix, but i chose a 2x2 example for clarity's sake.
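one more remark, since you asked whether we could have $\displaystyle P^{-1}AP[v]_{B'} = [T(v)]_{B'}$ for every v and yet $\displaystyle P^{-1}AP \neq A'$: here is a sketch of why that cannot happen. suppose a matrix M satisfies

$\displaystyle M[v]_{B'} = [T(v)]_{B'}$ for ALL v.

take v = b_{j} (the j-th vector of B'), so that $\displaystyle [v]_{B'} = e_{j}$. then $\displaystyle Me_{j}$ is just the j-th column of M, and the equation above says this column equals $\displaystyle [T(b_{j})]_{B'}$, which is, by definition, the j-th column of A'. since this holds for every j, all the columns agree, so M = A'. (and for the same reason: relative to a FIXED basis, two linear transformations are equal if and only if their matrices are equal.)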

Re: A' = P^{-1}AP and definition of the matrix of a linear transformation

Thanks a thousand times for taking the time to explain so eloquently (which I couldn't have done) and for being to the point. Again, thanks.