# linear algebra proof

• Jul 4th 2009, 06:30 PM
Danneedshelp
linear algebra proof
Q: If $A$ is an idempotent matrix, prove that the determinant of the matrix $A$ is either $0$ or $1$.

A: Suppose $A$ is idempotent. Then, by definition $A^{2}=A$; thus, $A$ has either a column of zeros, a row of zeros, or is the identity matrix (we will omit the trivial case when $A$ is the zero matrix).

Case 1: If $A$ has a row of zeros
$\Rightarrow \det(A)= \sum_{j}a_{ij}C_{ij}=0\cdot C_{i1}+0\cdot C_{i2}+\dots+0\cdot C_{im}=0$

Case 2: If $A$ has a column of zeros
$\Rightarrow \det(A)=\sum_{i}a_{ij}C_{ij}=0\cdot C_{1j}+0\cdot C_{2j}+\dots+0\cdot C_{mj}=0$

Case 3: If $A$ is the identity matrix, then each term $a_{ij}C_{ij}$ of the determinant expansion is equal to $1$ or $0$. Clearly, $a_{ij}C_{ij}=0$ when $a_{ij}=0$ or $M_{ij}=0$, while $a_{ij}C_{ij}=1$ only if $a_{ij}=1$ and $M_{ij}=1$. Observe that $M_{ij}=1$ only when $M_{ij}$ is the minor of a diagonal entry, and $a_{ij}=1$ only appears when $i=j$ (along the diagonal); thus $C_{ij}=(-1)^{i+j}M_{ij}$ will always be positive in those terms. Therefore, regardless of whether you expand by a row or a column, $a_{ij}=1$ and $M_{ij}=1$ only when $i=j$, which occurs exactly once in any row or column; as a result, $\det(A)=1$ whenever $A=I_{n}$.
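The three cases above can be sanity-checked numerically with a small cofactor-expansion routine (a hypothetical helper, not part of the thread; the example matrices are assumptions for illustration):

```python
def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor M_{1j}: delete row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        # Cofactor sign (-1)^{1+j}, written 0-indexed as (-1)**j.
        total += (-1) ** j * A[0][j] * det(minor)
    return total

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
zero_row = [[1, 2, 3], [0, 0, 0], [4, 5, 6]]

print(det(identity))  # 1  (Case 3)
print(det(zero_row))  # 0  (Case 1; the zero-column case is symmetric)
```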

Does that work? I know you can multiply the entries along the diagonal to find the determinant of an upper or lower triangular matrix, so it makes sense that $A=I_{n} \Rightarrow \det(A)=1$. Even so, I am not sure I described everything correctly. Also, I am not sure I covered every case, given $A^2=A$.

Thanks
• Jul 4th 2009, 07:27 PM
alunw
There is a much easier way if you are allowed to use the fact that det(A*B) = det(A)*det(B), which is a much more basic fact than the one you used about idempotent matrices having a row or column of zeros.

A*A= A
so det(A) = det(A*A) = det(A)*det(A)

The only solutions of x=x*x in any field are x=0 and x=1, so det(A)=0 or 1.
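alunw's argument can be checked numerically. The sketch below uses two assumed example matrices (a projection and the identity, not taken from the thread) and verifies that the determinant of each idempotent matrix satisfies d = d*d:

```python
def matmul(A, B):
    """Product of two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def det2(A):
    """2x2 determinant: ad - bc."""
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

P = [[1, 1], [0, 0]]   # a projection: P*P == P, det = 0
I = [[1, 0], [0, 1]]   # identity: I*I == I, det = 1

for M in (P, I):
    assert matmul(M, M) == M   # idempotent
    d = det2(M)
    assert d == d * d          # det(A) = det(A)^2, so det(A) is 0 or 1
    print(d)
```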
• Jul 5th 2009, 11:43 AM
Danneedshelp
Quote:

Originally Posted by alunw
There is a much easier way if you are allowed to use the fact that det(A*B) = det(A)*det(B), which is a much more basic fact than the one you used about idempotent matrices having a row or column of zeros.

A*A= A
so det(A) = det(A*A) = det(A)*det(A)

The only solutions of x=x*x in any field are x=0 and x=1, so det(A)=0 or 1.

Wow, that is much easier indeed. Although, I am not too familiar with the term field. I have heard of rings and fields (something to do with a set under one or two operations), but I have never been formally introduced to these concepts. Could you maybe explain the last part of your result?

Does my long drawn out proof still hold though?
• Jul 5th 2009, 01:14 PM
alunw
I stated the last part for general fields because you didn't say what kind of matrices you were dealing with. As well as matrices with real or complex entries, you can form matrices over any field to describe linear (and affine and projective) transformations between spaces. The first part of my proof shows that if x is the determinant of an idempotent matrix, then it satisfies x*x=x. If you haven't done fields, you will be working over the real or complex numbers, and it is easy to see that 0 and 1 are solutions. So I am also using the fact that a quadratic equation has at most two roots.
This is true in any field, so the result also holds for other kinds of matrices. (The quaternions are an example of what is called a skew field, over which quadratic equations can have more than two solutions, so for matrices over the quaternions you would have to show that this particular quadratic has only those two solutions - which it does.)
I'm not sure about your proof, since it starts from something that is not obvious to me, which I would think needs to be proved and which is probably quite hard to prove. I personally very rarely come across idempotent matrices, so I don't remember facts about them, but I use determinants and the fact that det(A*B) = det(A)*det(B) all the time.

Another proof would be as follows: if A*A = A, then A*x = x, where x is any column of A. So if x is non-zero, then 1 is an eigenvalue of the matrix. If the matrix is non-singular, its columns form a basis of the vector space, the eigenvalues are all 1, and the determinant is 1 (because the product of the eigenvalues of any matrix is its determinant). If the matrix is singular, then its determinant is 0 (and it certainly is singular if it has a column of zeros).
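The key step of this proof, that A fixes each of its own columns, can be sketched directly (the idempotent matrix below is an assumed example, easy to verify by hand, not one from the thread):

```python
def matvec(A, x):
    """Matrix-vector product for a square matrix given as a list of rows."""
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

# An idempotent matrix: A*A == A (check by hand).
A = [[1, 0, 1],
     [0, 1, 0],
     [0, 0, 0]]

for j in range(3):
    x = [A[i][j] for i in range(3)]   # j-th column of A
    if any(x):
        # A x = column j of A*A = column j of A = x,
        # so 1 is an eigenvalue whenever A has a nonzero column.
        assert matvec(A, x) == x
print("every nonzero column of A is fixed by A")
```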
• Jul 10th 2010, 12:33 AM
mathisfriendly
Counter example
@ danneedshelp:

You wrote:
Suppose $A$ is idempotent. Then, by definition $A^{2}=A$; thus, $A$ has either a column of zeros, a row of zeros, or is the identity matrix (we will omit the trivial case when $A$ is the zero matrix).

Here is an example for an idempotent matrix that has no zero column, no zero row and it is not the identity matrix:

$A := \left( \begin{array} {rrr} 2 & -2 & -4 \\ -1 & 3 & 4 \\ 1 & -2 & -3 \end{array} \right)$

So we all should follow alunw's proof, which is perfect!
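The counterexample can be verified directly: the matrix is idempotent, has no zero row or column, is not the identity, and its determinant is 0 (consistent with alunw's proof that it must be 0 or 1):

```python
A = [[ 2, -2, -4],
     [-1,  3,  4],
     [ 1, -2, -3]]

def matmul(A, B):
    """Product of two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def det3(M):
    """3x3 determinant via cofactor expansion along the first row."""
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

assert matmul(A, A) == A                          # idempotent
assert all(any(row) for row in A)                 # no zero row
assert all(any(A[i][j] for i in range(3)) for j in range(3))  # no zero column
print(det3(A))  # 0
```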