Originally Posted by **Gchan**

I apologize if my notation isn't clear; I'm new to this forum.

I'm trying to work out how to find the square root of a 3x3 matrix.

For A =

[ 1, 1, 1
  0, 1, 1
  0, 0, 1 ]

I know that, in general, A^x = (P^-1) (D^x) (P) for some invertible P with A = (P^-1) (D) (P), where the rows of P^-1 (the eigenvector matrix) form a basis of eigenvectors of A. But the eigenvalues of A here are all 1, and A has only one eigenvector, [1, 0, 0], up to scalar multiples. So that method isn't going to work.
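A quick numerical check of that defect (a sketch in NumPy, assuming that's an acceptable way to show it):

```python
import numpy as np

A = np.array([[1., 1., 1.],
              [0., 1., 1.],
              [0., 0., 1.]])

# All eigenvalues of A are 1 (upper triangular with 1's on the diagonal),
# but the eigenspace for 1 is only one-dimensional:
# rank(A - I) = 2, so dim ker(A - I) = 3 - 2 = 1.
eigenspace_dim = 3 - np.linalg.matrix_rank(A - np.eye(3))
print(eigenspace_dim)  # 1, so A is defective and cannot be diagonalized
```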

There is a method using spectral decomposition that I don't fully understand. It starts with the following statement:

For A an n x n matrix with distinct eigenvalues v_1, ..., v_s of multiplicities m_1, ..., m_s, there exist n uniquely defined constituent matrices E_{i,k} (i = 1, ..., s; k = 0, ..., m_i - 1) such that for any analytic function f(x) we have

f(A) = sum_{i=1}^{s} sum_{k=0}^{m_i - 1} f^(k)(v_i) E_{i,k},

where f^(k) denotes the k-th derivative of f.

Anyway, if you can decode that, it seems to me you can arrive at the constituent matrices of A by choosing particular functions f:

With f(x) = (x-1)^2 (so f(1) = 0, f'(1) = 0, f''(1) = 2), the formula gives

(A-I)(A-I) = 0 + 0 + 2 E_{1,2}

so E_{1,2} = (A-I)^2 / 2,

which works out to

[ 0, 0, 1/2
  0, 0, 0
  0, 0, 0 ]

With f(x) = x - 1 (so f(1) = 0, f'(1) = 1, f''(1) = 0), we get

A - I = E_{1,1}

which is of course

[ 0, 1, 1
  0, 0, 1
  0, 0, 0 ]

and finally, with f(x) = 1,

I = E_{1,0}
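Putting those three steps together in code (a NumPy sketch; E10, E11, E12 are just my variable names for E_{1,0}, E_{1,1}, E_{1,2}):

```python
import numpy as np

A = np.array([[1., 1., 1.],
              [0., 1., 1.],
              [0., 0., 1.]])
I = np.eye(3)

E10 = I                       # from f(x) = 1
E11 = A - I                   # from f(x) = x - 1
E12 = (A - I) @ (A - I) / 2   # from f(x) = (x - 1)^2, which yields 2*E12

print(E12)  # only nonzero entry is 1/2 in the top-right corner
```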

So we have 3 constituent matrices for A. Consider the linear combination

X E_{1,0} + Y E_{1,1} + Z E_{1,2}

It turns out for values X=1, Y= 1/2, and Z = -1/4 you get

[ 1, 1/2, 3/8
  0, 1, 1/2
  0, 0, 1 ]

whose square is A. So somehow we have to use the constituent matrices in a linear combination to generate the square root of A, but I don't know how to get the values of X, Y, Z in general.
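For what it's worth, the combination does check out numerically (another NumPy sketch; the E-matrices are rebuilt from the steps above):

```python
import numpy as np

A = np.array([[1., 1., 1.],
              [0., 1., 1.],
              [0., 0., 1.]])
I = np.eye(3)
E10, E11, E12 = I, A - I, (A - I) @ (A - I) / 2

# The combination with X = 1, Y = 1/2, Z = -1/4:
B = 1.0 * E10 + 0.5 * E11 - 0.25 * E12
print(np.allclose(B @ B, A))  # True: B really is a square root of A
```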