The matrix A below represents a projection P. Find the image P(V) and the kernel P^(-1)(0), so that V = P(V) ⊕ P^(-1)(0) (a direct sum), for:

A = (1/3) * |  2   1  -1 |
            |  1   2   1 |
            | -1   1   2 |
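For what it's worth, I did sanity-check that this A is idempotent (A² = A, the defining property of a projection), so I'm fairly sure I've copied the matrix correctly. This is just my own quick check, not part of the problem:

```python
# My own sanity check (not part of the original problem):
# verify that A = (1/3) * M satisfies A @ A == A,
# which is what makes A a projection matrix.
from fractions import Fraction

M = [[2, 1, -1],
     [1, 2, 1],
     [-1, 1, 2]]

# Use exact rational arithmetic so the 1/3 factor introduces no rounding.
A = [[Fraction(x, 3) for x in row] for row in M]

def matmul(X, Y):
    """3x3 matrix product, computed entry by entry."""
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

print(matmul(A, A) == A)  # prints True, so A^2 = A holds exactly
```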

Also give the diagonalized form of P.

The underlying problem I'm having is that I don't really understand projections and the image/kernel decomposition. If you could explain these concepts, perhaps using this question as a worked example, that would be really helpful.

Thanks