Hey Osp85.

For this one, try showing that (A^(-1) + B^(-1)) is non-singular; once you have that, (A^(-1) + B^(-1))^(-1) exists by definition.

We know that A^(-1) exists since A is non-singular. We also know that B^(-1) exists since B is non-singular.

We know that (A+B)^(-1) exists since (A+B) is non-singular.

Let det(A) = a, det(B) = b, det(A+B) = c.

det[A^(-1)] = 1/a and det[B^(-1)] = 1/b.

For det[A^(-1) + B^(-1)], use the identity

A^(-1) + B^(-1) = A^(-1)(A + B)B^(-1)

(you can check it by multiplying out the right-hand side: A^(-1)(A + B)B^(-1) = A^(-1)AB^(-1) + A^(-1)BB^(-1) = B^(-1) + A^(-1)). Since the determinant of a product is the product of the determinants,

det[A^(-1) + B^(-1)] = (1/a) * c * (1/b) = c/(ab).

(Related reading, though the proof doesn't depend on it:

http://www.math.umbc.edu/~gowda/papers/trGOW10-02.pdf )

Now A, B and (A+B) are all non-singular, so a, b and c are all non-zero, and therefore c/(ab) != 0. A matrix with non-zero determinant is non-singular, so A^(-1) + B^(-1) is non-singular, i.e. (A^(-1) + B^(-1))^(-1) exists.

Thus the statement is proven.
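If you want to convince yourself numerically, here's a quick sanity check of the determinant formula det[A^(-1) + B^(-1)] = c/(ab) on small 2x2 matrices. The matrices A and B below are just made-up example values; exact arithmetic with Fraction avoids floating-point noise.

```python
from fractions import Fraction

def det2(M):
    # determinant of a 2x2 matrix [[p, q], [r, s]]
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def inv2(M):
    # inverse of a non-singular 2x2 matrix via the adjugate formula
    d = det2(M)
    return [[ M[1][1] / d, -M[0][1] / d],
            [-M[1][0] / d,  M[0][0] / d]]

def add2(M, N):
    # entrywise sum of two 2x2 matrices
    return [[M[i][j] + N[i][j] for j in range(2)] for i in range(2)]

# example matrices (arbitrary, both non-singular, with A + B non-singular)
A = [[Fraction(2), Fraction(1)], [Fraction(1), Fraction(1)]]
B = [[Fraction(1), Fraction(0)], [Fraction(1), Fraction(3)]]

a, b = det2(A), det2(B)          # a = 1, b = 3
c = det2(add2(A, B))             # c = 10

lhs = det2(add2(inv2(A), inv2(B)))
print(lhs == c / (a * b))        # True: det[A^(-1) + B^(-1)] = c/(ab)
```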

Incidentally, the whole argument really just comes down to the fact that a, b and c are all non-zero; nothing more is needed.
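And just to see why the hypothesis that (A+B) is non-singular matters: with B = -A the conclusion fails. A tiny self-contained sketch (the diagonal entries are arbitrary example values):

```python
# If A + B is singular the conclusion can fail: take B = -A.
# Then A and B are each invertible, but A^(-1) + B^(-1) = A^(-1) - A^(-1) = 0.
A = [[2.0, 0.0], [0.0, 3.0]]    # invertible diagonal matrix (example values)
B = [[-2.0, 0.0], [0.0, -3.0]]  # B = -A: invertible, but A + B = 0 is singular

# inverses of diagonal matrices are just the reciprocal diagonals
A_inv = [[1 / A[0][0], 0.0], [0.0, 1 / A[1][1]]]
B_inv = [[1 / B[0][0], 0.0], [0.0, 1 / B[1][1]]]

S = [[A_inv[i][j] + B_inv[i][j] for j in range(2)] for i in range(2)]
det_S = S[0][0] * S[1][1] - S[0][1] * S[1][0]
print(det_S)  # 0.0, so A^(-1) + B^(-1) is singular here
```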