If x=(x1,x2)^T, y=(y1,y2)^T, and z=(z1,z2)^T are arbitrary vectors in R^2, prove that

a) x^Tx (x transpose x) >= 0

So basically I expanded it and got x^T x = x1^2 + x2^2, and I need to show this is >= 0.

Not sure where to go with this proof... thanks
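As a quick numerical sanity check (not a proof, and not from the original post), here is a small Python sketch of the expansion x^T x = x1^2 + x2^2: each term is a square of a real number, so the sum comes out nonnegative for every sample vector tried. The helper name `dot_self` is made up for illustration.

```python
# Sanity check (not a proof): for any real x = (x1, x2),
# x^T x expands to x1**2 + x2**2, a sum of squares, hence >= 0.

def dot_self(x):
    """Compute x^T x for a 2-vector x = (x1, x2)."""
    x1, x2 = x
    return x1 * x1 + x2 * x2

# Try a few sample vectors, including negatives and the zero vector.
samples = [(3.0, -4.0), (0.0, 0.0), (-1.5, 2.5)]
for x in samples:
    val = dot_self(x)
    assert val >= 0.0  # holds for every real vector
    print(x, "->", val)
```

Checking finitely many vectors only illustrates the identity; the actual proof is the algebraic observation that a square of a real number is never negative.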

- Nov 28th 2007, 06:41 PM - pakman - Orthogonality
- Nov 28th 2007, 07:15 PM - ThePerfectHacker