linear algebra question

• October 1st 2008, 09:51 PM
squarerootof2
linear algebra question
Let A and B be n×n matrices over the field of complex numbers.
How would I show that if B is invertible, then there exists a scalar c in C such that A+cB is not invertible?

I was examining det(A+cB) as suggested by the hints but could not reach a solution.
• October 1st 2008, 10:39 PM
wisterville
Hello,

Use det(XY)=det(X)det(Y).
Since A+cB = (AB^{-1}+cE)B and det(B) is nonzero, it suffices to find c such that det(AB^{-1}+cE)=0. This holds exactly when -c is an eigenvalue of AB^{-1}, and every complex matrix has at least one eigenvalue.
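A small numerical sketch of this hint, using a hypothetical 2×2 pair A and B (chosen here for illustration, not taken from the thread): form M = AB^{-1}, take c to be the negative of an eigenvalue of M, and check that det(A+cB) vanishes.

```python
import cmath

# Hypothetical 2x2 instance (these matrices are an assumption for the demo):
A = [[1, 2], [3, 4]]
B = [[2, 1], [1, 1]]   # det(B) = 1, so B is invertible

def det2(M):
    """Determinant of a 2x2 matrix."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    d = det2(M)
    return [[M[1][1] / d, -M[0][1] / d],
            [-M[1][0] / d, M[0][0] / d]]

def mul2(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

M = mul2(A, inv2(B))               # M = A B^{-1}
# Eigenvalues of a 2x2 matrix are the roots of x^2 - tr(M) x + det(M) = 0.
tr = M[0][0] + M[1][1]
disc = cmath.sqrt(tr * tr - 4 * det2(M))
for eig in ((tr + disc) / 2, (tr - disc) / 2):
    c = -eig                       # det(AB^{-1} + cE) = 0 at this c
    AcB = [[A[i][j] + c * B[i][j] for j in range(2)] for i in range(2)]
    assert abs(det2(AcB)) < 1e-9   # so A + cB is singular
```

The assertions pass because det(A+cB) = det(AB^{-1}+cE)·det(B), and the first factor is zero whenever -c is an eigenvalue of AB^{-1}.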

Bye.
• October 2nd 2008, 07:43 AM
ThePerfectHacker
Quote:

Originally Posted by squarerootof2
I was examining det(A+cB) as suggested by the hints but could not reach a solution.

Define the function $f(c) = \det (A+cB)$. Expanding the determinant shows that $f$ is a polynomial in $c$ of degree $n$, whose leading coefficient is $\det(B) \neq 0$ because $B$ is invertible. Therefore by the fundamental theorem of algebra there exists a complex solution to $f(c) = 0$, and for that $c$ the matrix $A+cB$ is not invertible.
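This argument can be checked on a concrete 2×2 case. The matrices below are a hypothetical example (not from the thread): with $B = E$, the polynomial $f(c) = \det(A+cB)$ works out to $c^2 + 5c - 2$, and its two complex roots each make $A+cB$ singular.

```python
import cmath

# Hypothetical example: B = identity (invertible), A = [[1, 2], [3, 4]].
A = [[1, 2], [3, 4]]
B = [[1, 0], [0, 1]]

def det2(M):
    """Determinant of a 2x2 matrix."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def add_scaled(A, B, c):
    """Entrywise A + c*B."""
    return [[A[i][j] + c * B[i][j] for j in range(2)] for i in range(2)]

# f(c) = det(A + cB) = (1+c)(4+c) - 6 = c^2 + 5c - 2: degree 2 = n,
# leading coefficient det(B) = 1.  Solve f(c) = 0 by the quadratic
# formula; the fundamental theorem of algebra guarantees roots exist.
a, b_coef, const = 1, 5, -2
disc = cmath.sqrt(b_coef**2 - 4 * a * const)
for root in ((-b_coef + disc) / (2 * a), (-b_coef - disc) / (2 * a)):
    assert abs(det2(add_scaled(A, B, root))) < 1e-9  # A + cB is singular
```

Here both roots happen to be real, but over C the same argument goes through for any invertible B, since a degree-n polynomial with n ≥ 1 always has a complex root.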