If I am understanding you correctly, this is not true.
For example, suppose . It is easy to see that A has eigenvalues 2 and 3.
Now, take , a diagonal matrix with determinant 1.
which does NOT have 2 and 3 as eigenvalues.
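The specific matrices in the post did not survive formatting, but the counterexample can be checked numerically. Assuming the claim under discussion is that multiplying A by a diagonal matrix D with determinant 1 leaves the eigenvalues unchanged, here is a sketch with hypothetical matrices: A with eigenvalues 2 and 3, and D = diag(2, 1/2), which has determinant 1.

```python
import math

def eig2(m):
    """Eigenvalues of a 2x2 matrix [[a, b], [c, d]] via the characteristic polynomial."""
    (a, b), (c, d) = m
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)  # assumes real eigenvalues
    return sorted([(tr - disc) / 2, (tr + disc) / 2])

def matmul2(x, y):
    """2x2 matrix product."""
    return [[sum(x[i][k] * y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 0], [0, 3]]    # hypothetical A with eigenvalues 2 and 3
D = [[2, 0], [0, 0.5]]  # diagonal, det(D) = 2 * 0.5 = 1
print(eig2(A))              # [2.0, 3.0]
print(eig2(matmul2(D, A)))  # [1.5, 4.0] -- not {2, 3}
```

Even though det(D) = 1, the product D A has eigenvalues 1.5 and 4, not 2 and 3.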
Suppose I have this:
And then there is a diagonal matrix whose determinant is always equal to 1:
Then, for some reason, is always true.
How can I show that the lambda, i.e. the eigenvalue matrix, does not change for all ?
And what is the relationship between and ? I understand that their determinants are , but still, what is the relationship between and ? One is the eigenvector matrix for while the other is the eigenvector matrix for , so they are different. But still, there isn't any strong relationship, is there?
Thanks!
Hmm... this is weird. Earlier on, I tried several matrices for my and then many different matrices for , such as , , , etc., and they all worked.
After your example, I tried more values in the matrices and began to realise that it doesn't work all the time. I'm curious whether there are any special properties that would make this proposition true?
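One standard operation that does preserve eigenvalues for every invertible D (diagonal or not, any determinant) is the similarity transform D A D^-1, which may be what the examples that "worked" were actually computing. A quick sketch with hypothetical matrices, reusing a small 2x2 eigenvalue helper:

```python
import math

def eig2(m):
    """Eigenvalues of a 2x2 matrix [[a, b], [c, d]] via the characteristic polynomial."""
    (a, b), (c, d) = m
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)  # assumes real eigenvalues
    return sorted([(tr - disc) / 2, (tr + disc) / 2])

def matmul2(x, y):
    """2x2 matrix product."""
    return [[sum(x[i][k] * y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 1], [0, 3]]     # hypothetical A, eigenvalues 2 and 3
D = [[2, 0], [0, 0.5]]   # diagonal, det(D) = 1
Dinv = [[0.5, 0], [0, 2]]  # inverse of D

B = matmul2(matmul2(D, A), Dinv)  # similarity transform D A D^-1
print(eig2(A))  # [2.0, 3.0]
print(eig2(B))  # [2.0, 3.0] -- eigenvalues preserved
```

By contrast, the plain product D A (without the D^-1 on the right) is not a similarity transform, which is why the counterexample above breaks the proposition.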