Work in M(n, R), the space of n×n matrices over the real numbers.
Let f(X) = X^-1. Find the total derivative of f at I, the identity matrix. That is, find an expression
f(I+H) - f(I) = (linear in H) + (quadratic and higher terms in H)
Thoughts: This is pretty easy if you assume you can expand (I+H)^-1 binomially, but a later question asks me to use Taylor's Theorem to prove that expansion, so I obviously can't use it in this question.
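(Not a proof, just a hedged numerical sanity check of what the answer should come out to: if f(X) = X^-1, the expected derivative at I is Df(I)[H] = -H, so the remainder (I+H)^-1 - I + H should shrink quadratically in ||H||. The names n, eps, H0 below are my own illustrative choices.)

```python
import numpy as np

# Sanity check: (I+H)^-1 - I = -H + O(||H||^2), so the remainder
# (I+H)^-1 - I + H should scale like ||H||^2 as H -> 0.
rng = np.random.default_rng(0)
n = 4
H0 = rng.standard_normal((n, n))  # fixed direction; scale it down by eps

for eps in (1e-1, 1e-2, 1e-3):
    H = eps * H0
    remainder = np.linalg.inv(np.eye(n) + H) - np.eye(n) + H
    # ||remainder|| should drop by ~100x each time eps drops by 10x.
    print(eps, np.linalg.norm(remainder))
```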
So, how on earth do I find (I+H)^-1 - I without expanding the bracket out?
Thanks for any help. Also, sorry if this is in the wrong place. Couldn't decide if it belonged here or in Analysis.