Then, you have to find the eigenvectors.
Let's do it for one of the complex eigenvalues (for the others, the technique is the same!)
That is:
From (2), you get
Substitute into (3):
Now set, for example, a = 1 (that's what's nice about eigenvectors: you can choose one component arbitrarily).
And now compute the remaining components.
The computation is similar for the conjugate eigenvalue.
For the real eigenvalue, it should be easy, because all the coefficients are real.
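The hand computation above can be checked numerically. Here is a minimal NumPy sketch using a hypothetical 3×3 matrix with one real and two complex-conjugate eigenvalues (the actual matrix from this thread is not shown here):

```python
import numpy as np

# Hypothetical example matrix (NOT the one from the thread):
# the top-left 2x2 block has eigenvalues +-i, plus a real eigenvalue 2.
A = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# COLUMNS are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, v in zip(eigenvalues, eigenvectors.T):
    # Each pair must satisfy the defining equation A v = lambda v.
    assert np.allclose(A @ v, lam * v)
```

Note that NumPy normalizes the eigenvectors to unit length; any nonzero scalar multiple (such as the one obtained by setting a component to 1 by hand) is an equally valid eigenvector.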
Once you get the 3 eigenvectors, place them in columns to form the matrix $P$ such that
$$P^{-1}AP = D$$
(the order of the eigenvalues in $D$ has to be the same as the order of the eigenvectors in $P$).
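That construction can be verified numerically as well; again this uses a hypothetical matrix, since the original one is not shown in the thread:

```python
import numpy as np

# Hypothetical example matrix (not the one from the thread).
A = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 2.0]])

# Columns of P are the eigenvectors; D holds the eigenvalues on its
# diagonal, in the SAME order as the corresponding columns of P.
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# Verify the diagonalization A = P D P^{-1}.
assert np.allclose(A, P @ D @ np.linalg.inv(P))
```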
Just want to confirm my thought process on backsolving for $A$ given a matrix power $A^n$.
A matrix of sums is also a sum of matrices, and vice versa. Right? For example,
would it go like this?
Once I find my $A^n$, just plug in $n=1$ to get $A$. Right?
The $n$-th power of a matrix is not the matrix formed by raising each element to the $n$-th power.
Try it on a small example: compute $A^2$ as a matrix product and compare it with the matrix of squared entries.
You will see that it's not the case.
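A two-line check makes the difference concrete; this is a generic counterexample, not a matrix taken from the thread:

```python
import numpy as np

# Counterexample: the n-th power of a matrix is NOT the matrix
# of n-th powers of its entries.
A = np.array([[1, 1],
              [0, 1]])

matrix_square = A @ A        # true matrix product A * A
elementwise_square = A ** 2  # squares each entry separately

print(matrix_square)         # [[1 2], [0 1]]
print(elementwise_square)    # [[1 1], [0 1]]
assert not np.array_equal(matrix_square, elementwise_square)
```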
You don't need to find $A$; you need to find an expression for $A^n$.
And by finding its eigenvalues and the transition matrix $P$, you'll have:
$$A = PDP^{-1}$$
But as I said above, the $n$-th power of a diagonal matrix is the matrix formed by the $n$-th powers of its diagonal elements.
So finally, you have:
$$A^n = P D^n P^{-1}$$
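Putting it together, here is a sketch of computing a matrix power via the eigendecomposition, using a hypothetical symmetric 2×2 matrix (chosen only because its eigenvalues are easy to check):

```python
import numpy as np

# Hypothetical matrix with eigenvalues 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, P = np.linalg.eig(A)

def matrix_power(n: int) -> np.ndarray:
    # D^n is just the diagonal matrix of eigenvalues
    # raised to the n-th power.
    Dn = np.diag(eigenvalues ** n)
    return P @ Dn @ np.linalg.inv(P)

# Plugging in n = 1 recovers A itself, as suggested above.
assert np.allclose(matrix_power(1), A)

# And it agrees with repeated matrix multiplication.
assert np.allclose(matrix_power(5), np.linalg.matrix_power(A, 5))
```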
Does it look clear?