# Thread: Determine which of the formulas hold for all invertible n x n matrices A and B

1. ## Determine which of the formulas hold for all invertible n x n matrices A and B

The answers are the ones I have checked. I just don't understand why they are correct, and I also don't understand why the rest are wrong, although I do have some suspicions. I do know for a fact that (AB)^(-1) = B^(-1) A^(-1), so C is automatically wrong. But for B, is it wrong because of the order? Could someone please explain, or elaborate if I'm correct? Also, I have no idea why D or E is wrong.

Any input would be greatly appreciated!
Thanks in advance!

2. First, to show that B through E are wrong, one formally has to come up with counterexamples. For example, the fact that $\displaystyle (AB)^{-1}=B^{-1}A^{-1}$, while C says $\displaystyle (AB)^{-1}=A^{-1}B^{-1}$, and that there is no obvious way to prove $\displaystyle B^{-1}A^{-1}=A^{-1}B^{-1}$, does not automatically imply that C is false. Chances are, however, that random 2 x 2 matrices with integer coefficients from 0 to 3 will provide a counterexample. You can do the computations in WolframAlpha; see this help page. (Though, if you will have tests where you can't use computers, it is highly recommended to get used to doing computations by hand.)
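As an illustration of that random search, here is a minimal pure-Python sketch (the 2x2 helper functions, the entry range, and the seed are my own arbitrary choices for this example):

```python
import random

def mul(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def det(X):
    return X[0][0] * X[1][1] - X[0][1] * X[1][0]

def inv(X):
    """Inverse of a 2x2 matrix with nonzero determinant."""
    d = det(X)
    return [[X[1][1] / d, -X[0][1] / d], [-X[1][0] / d, X[0][0] / d]]

def close(X, Y, tol=1e-9):
    return all(abs(X[i][j] - Y[i][j]) < tol for i in range(2) for j in range(2))

def rand2x2():
    return [[random.randint(0, 3) for _ in range(2)] for _ in range(2)]

# Draw random invertible 2x2 integer matrices until the claimed
# identity (AB)^(-1) = A^(-1) B^(-1) from C fails.
random.seed(0)
while True:
    A, B = rand2x2(), rand2x2()
    if det(A) == 0 or det(B) == 0:  # skip singular draws
        continue
    if not close(inv(mul(A, B)), mul(inv(A), inv(B))):
        print("Counterexample to C: A =", A, "B =", B)
        break
```

The loop terminates almost immediately, because only commuting pairs (a rare draw) satisfy C.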

A. $\displaystyle (ABA^{-1})^4=(ABA^{-1})(ABA^{-1})(ABA^{-1})(ABA^{-1})=AB(A^{-1}A)B(A^{-1}A)B(A^{-1}A)BA^{-1}=AB^4A^{-1}$
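The identity in A can also be checked numerically; a minimal pure-Python sketch (the two sample matrices are arbitrary invertible choices):

```python
def mul(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv(X):
    """Inverse of a 2x2 matrix with nonzero determinant."""
    d = X[0][0] * X[1][1] - X[0][1] * X[1][0]
    return [[X[1][1] / d, -X[0][1] / d], [-X[1][0] / d, X[0][0] / d]]

def power(X, n):
    R = [[1, 0], [0, 1]]  # identity
    for _ in range(n):
        R = mul(R, X)
    return R

A = [[1, 2], [3, 4]]  # any invertible A and B will do
B = [[0, 1], [1, 1]]
lhs = power(mul(mul(A, B), inv(A)), 4)  # (A B A^{-1})^4
rhs = mul(mul(A, power(B, 4)), inv(A))  # A B^4 A^{-1}
print(lhs)
print(rhs)
```

The two printed matrices agree up to floating-point rounding, as the algebra above predicts.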

For B through D, I'll just show where potential proofs don't go through. Matrix multiplication is in general non-commutative. In B, $\displaystyle ABA^{-1}\ne AA^{-1}B=B$. In fact, you'll learn that $\displaystyle ABA^{-1}$ gives the same linear transformation as $\displaystyle B$ but in a different basis. C was discussed above. In D, $\displaystyle (A+B)^2=A^2+AB+BA+B^2$, but again there is no reason why $\displaystyle AB+BA=2AB$.
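For D in particular, a single concrete non-commuting pair already breaks the identity; a minimal pure-Python check (the sample matrices are arbitrary):

```python
def mul(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]

print(mul(A, B))  # [[2, 1], [4, 3]]
print(mul(B, A))  # [[3, 4], [1, 2]]  -- so AB != BA

S = add(A, B)
lhs = mul(S, S)                                   # (A + B)^2
wrong = add(add(mul(A, A), mul(B, B)),
            add(mul(A, B), mul(A, B)))            # A^2 + 2AB + B^2
right = add(add(mul(A, A), mul(B, B)),
            add(mul(A, B), mul(B, A)))            # A^2 + AB + BA + B^2
print(lhs == wrong, lhs == right)                 # False True
```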

For E, consider, for example, $\displaystyle A=\begin{pmatrix}0&2\\2&3\end{pmatrix}$: then $\displaystyle A+I=\begin{pmatrix}1&2\\2&4\end{pmatrix}$, whose determinant is $1\cdot4-2\cdot2=0$, so $A+I$ is not invertible. Finally, $\displaystyle (A^5B^8)^{-1}=(B^8)^{-1}(A^5)^{-1}=(B^{-1})^8(A^{-1})^5$.
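A quick arithmetic check of that counterexample for E (pure Python, exact integer arithmetic):

```python
A = [[0, 2], [2, 3]]
# Form A + I by adding 1 to the diagonal entries.
AI = [[A[i][j] + (1 if i == j else 0) for j in range(2)] for i in range(2)]
d = AI[0][0] * AI[1][1] - AI[0][1] * AI[1][0]
print(AI, "det =", d)  # [[1, 2], [2, 4]] det = 0, so A + I is not invertible
```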

3. For E, is there an algebraic way to disprove it? For D, is it false because we cannot assume that AB = BA? If we did assume such equality, what does it mean? I also don't get what you did for A after multiplying it by itself four times...did you factor? Is this "LU Factorization"? I don't think I covered that. If not, what is it?

4. For E, is there an algebraic way to disprove it?
E claims that for every matrix A, A + I is invertible. This claim is false, so its negation is true. Its negation is: there exists a matrix A such that A + I is not invertible. This is what one has to prove. A preferable way to prove statements of the form "There exists an A such that ..." is to produce such an A. Not only is such a proof valid, it is arguably better than proving the same thing by contradiction, i.e., showing that "for every matrix A, A + I is invertible" leads to a contradiction.

TL;DR: To prove a statement that starts with "for all", one has to consider all possible objects. To disprove such a statement, it is enough to find one counterexample.

For D, is it false because we cannot assume that AB = BA?
Yes.

If we did assume such equality, what does it mean?
Then D would be true. Multiplication within certain classes of matrices, for example those with nonzero entries only on the diagonal, is commutative.
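For instance, here is a small pure-Python check with two diagonal matrices (arbitrary sample entries), for which D does hold:

```python
def mul(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

A = [[2, 0], [0, 5]]   # diagonal matrices commute,
B = [[3, 0], [0, -1]]  # so (A+B)^2 = A^2 + 2AB + B^2 here

print(mul(A, B) == mul(B, A))  # True
S = add(A, B)
lhs = mul(S, S)                                        # (A + B)^2
rhs = add(add(mul(A, A), mul(B, B)),
          add(mul(A, B), mul(A, B)))                   # A^2 + 2AB + B^2
print(lhs == rhs)  # True
```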

I also don't get what you did for A after multiplying it by itself four times...did you factor?
First, I wrote C^4 as C * C * C * C, where C is ABA^(-1). Second, I applied the law of associativity (many times), which allows rearranging parentheses. Note that I did not change the order of the factors. Finally, I replaced A^(-1) * A with I and replaced B * I with B (three times).

5. Thanks! I get it now!
