# Thread: Basis of complex vector space

1. ## Basis of complex vector space

When I have to prove that the subset {(3-i, 2+2i, 4), (2, 2+4i, 3), (1-i, -2i, -1)} is a basis of the complex vector space C3, am I supposed to find an inverse (to obtain the unique solution)?

2. ## Re: Basis of complex vector space

Hey jojo7777777.

Consider that C as a vector space over R is isomorphic to R^2, and then use the techniques for R^n to get a basis (hint: you will end up with six vectors in R^6, i.e. a 6x6 real matrix to row-reduce).
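This hint can be sketched numerically (using numpy; stacking real parts before imaginary parts is just one choice of the isomorphism C^3 → R^6):

```python
import numpy as np

# The three candidate basis vectors in C^3.
vs = [np.array([3 - 1j, 2 + 2j, 4]),
      np.array([2, 2 + 4j, 3]),
      np.array([1 - 1j, -2j, -1])]

def realify(z):
    # One choice of R-linear isomorphism C^3 -> R^6:
    # (z1, z2, z3) -> (Re z1, Re z2, Re z3, Im z1, Im z2, Im z3)
    return np.concatenate([z.real, z.imag])

# Each complex vector v contributes two real vectors, v and i*v,
# since the complex span allows scaling by i.
rows = [realify(v) for v in vs] + [realify(1j * v) for v in vs]
M = np.array(rows)

print(np.linalg.matrix_rank(M))  # 6: the six real vectors are independent
```

Rank 6 in R^6 means the six real vectors are a basis of R^6, which corresponds to the three complex vectors spanning C^3 over C.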

3. ## Re: Basis of complex vector space

My idea was this: given (a,b,c) ∈ ℂ³, find d,g,f ∈ ℂ such that
(a,b,c) = d(3-i, 2+2i, 4) + g(2, 2+4i, 3) + f(1-i, -2i, -1).
If I can show that d,g,f are uniquely determined, then the given set is a basis.

But how to solve it for d,g,f?
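One concrete way to solve for d, g, f is numerically (a sketch with numpy; the right-hand side (a, b, c) below is an arbitrary choice):

```python
import numpy as np

# Columns are the candidate basis vectors.
M = np.array([[3 - 1j, 2,      1 - 1j],
              [2 + 2j, 2 + 4j, -2j   ],
              [4,      3,      -1    ]])

abc = np.array([1 + 1j, 0, 2])    # an arbitrary target vector (a, b, c)

dgf = np.linalg.solve(M, abc)     # unique solution because det(M) != 0
print(np.allclose(M @ dgf, abc))  # True: the coefficients reproduce (a, b, c)
```

For the proof itself, of course, what matters is that the coefficient matrix is invertible, so that the solution exists and is unique for every (a, b, c).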

4. ## Re: Basis of complex vector space

Calculate the determinant.

5. ## Re: Basis of complex vector space

well if they are linearly independent, then they span a subspace of C3 of complex dimension 3, which must be C3 itself since:

dimC(C3) = 3.

so use the definition of linear independence:

suppose that a(3-i,2+2i,4) + b(2,2+4i,3) +c(1-i,-2i,-1) = 0 = (0,0,0) = (0+0i,0+0i,0+0i).

we get 3 equations in 3 unknowns (the complex numbers a,b and c):

(3-i)a + 2b + (1-i)c = 0
(2+2i)a + (2+4i)b - (2i)c = 0
4a + 3b - c = 0

this system of equations has a UNIQUE solution (namely : (a,b,c) = (0,0,0)) if and only if the matrix:

$\begin{bmatrix}3-i&2&1-i\\2+2i&2+4i&-2i\\4&3&-1 \end{bmatrix}$

is invertible (non-singular), which you may determine by row-reduction and/or finding the determinant.

i find the determinant to be:

(3-i)(2+4i)(-1) + (2)(-2i)(4) + (1-i)(2+2i)(3) - (1-i)(2+4i)(4) - (-2i)(3)(3-i) - (-1)(2)(2+2i) =

-10-10i -16i + 12 - (24+8i) - (-6-18i) - (-4-4i) =

-10-10i - 16i + 12 - 24-8i + 6+18i + 4+4i = -12-12i ≠ 0, which settles the matter.
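The hand computation above can be double-checked numerically (a quick sketch with numpy, which handles complex determinants directly):

```python
import numpy as np

M = np.array([[3 - 1j, 2,      1 - 1j],
              [2 + 2j, 2 + 4j, -2j   ],
              [4,      3,      -1    ]])

det = np.linalg.det(M)
print(det)  # approximately (-12-12j), matching the hand calculation
```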

6. ## Re: Basis of complex vector space

ie, calculate the determinant. See post 4.

Edit: I believe the original question assumed a knowledge of bases, linear independence, and the linear independence of the rows and columns of matrices over the real field F. The point is that these notions apply to a vector space over any field F, whether F is real or complex. If F is real, the rows (or columns) of a square matrix are linearly independent if and only if the determinant of the matrix is nonzero. The same applies if F is complex, and that's the point.

7. ## Re: Basis of complex vector space

Thank you very much...!!!
Deveno thank you for your step by step explanation!

8. ## Re: Basis of complex vector space

Originally Posted by Hartlw
ie, calculate the determinant. See post 4.

Edit: I believe the original question assumed a knowledge of bases, linear independence, and the linear independence of the rows and columns of matrices over the real field F. The point is that these notions apply to a vector space over any field F, whether F is real or complex. If F is real, the rows (or columns) of a square matrix are linearly independent if and only if the determinant of the matrix is nonzero. The same applies if F is complex, and that's the point.
i agree completely. people often get "so used" to doing linear algebra over the reals that the fact that vector spaces can be defined over ANY field (such as Q, or C, or the field of real rational functions in x, or a finite galois field) gets lost in the mix.

in fact, a lot of linear algebra transfers fairly well to integral domains, but some solutions lie outside of the domain (a system of linear equations with integer coefficients may have fractional solutions, for example).

i did the calculations explicitly because i wanted to show we had a 3x3 determinant to evaluate, and not a 6x6 one (as suggested in an earlier post), which would have been horrendous to actually calculate. to be fair, that post recommended row-reduction, which i would probably prefer in a 6-dimensional situation.
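For what it's worth, the 6x6 real determinant is directly related to the 3x3 complex one: replacing each entry a+bi by the 2x2 real block [[a, -b], [b, a]] gives a real matrix whose determinant equals |det|^2 of the complex matrix. A numerical sketch (numpy, assuming that standard block realification):

```python
import numpy as np

M = np.array([[3 - 1j, 2,      1 - 1j],
              [2 + 2j, 2 + 4j, -2j   ],
              [4,      3,      -1    ]])

# Realify: each complex entry a+bi becomes the 2x2 real block [[a,-b],[b,a]].
R = np.block([[np.array([[z.real, -z.imag],
                         [z.imag,  z.real]]) for z in row] for row in M])

print(np.linalg.det(M))  # about (-12-12j)
print(np.linalg.det(R))  # about 288.0, which is |-12-12j|^2
```

So the 6x6 determinant is nonzero exactly when the 3x3 complex one is, which is why either route settles the question.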

9. ## Re: Basis of complex vector space

I have two questions:
1. Does row-reduction (when the vector space is over the complex numbers) obey the "usual rules"? For example, will the first row be divided by 3-i?
Originally Posted by Deveno
in fact, a lot of linear algebra transfers fairly well to integral domains, but some solutions lie outside of the domain (a system of linear equations with integer coefficients may have fractional solutions, for example).
2. Does an integral domain exclude fractional solutions? Isn't it just about the absence of zero divisors?

10. ## Re: Basis of complex vector space

Originally Posted by jojo7777777
I have two questions:
1. Does row-reduction (when the vector space is over the complex numbers) obey the "usual rules"? For example, will the first row be divided by 3-i?

2. Does an integral domain exclude fractional solutions? Isn't it just about the absence of zero divisors?
1) yes
2) An integral domain doesn't exclude them; rather, the fractional solutions lie in its field of fractions, not in the domain itself: there, ab = c has a solution b = c/a whenever a ≠ 0 (e.g. 2b = 3 has no solution in ℤ, but b = 3/2 in ℚ). The defining property of the domain is, as you say, the absence of zero divisors.
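To illustrate solutions lying outside the domain (a small sketch using Python's exact rational arithmetic; the particular system is made up for illustration):

```python
from fractions import Fraction

# The integer-coefficient system  2x + 4y = 5,  x - y = 0
# has no solution in Z, but a unique one in Q, the field of
# fractions of Z: x = y = 5/6.
x = y = Fraction(5, 6)
print(2 * x + 4 * y == 5 and x - y == 0)  # True
```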

An integral domain is a commutative ring satisfying the cancellation law ca = cb → a = b (for c ≠ 0), which is equivalent to having no divisors of zero:
ca = cb → ca - cb = 0 → c(a-b) = 0 → c = 0 or a-b = 0. So if c ≠ 0, then a = b.

a & b are divisors of zero if a ≠ 0 and b ≠ 0 and ab = 0.

Example of divisors of zero (Perlis): let A and B be the 2x2 matrices

$A = \begin{bmatrix}-2&1\\2&-1\end{bmatrix}, \quad B = \begin{bmatrix}2&3\\4&6\end{bmatrix}$

Then A ≠ 0 and B ≠ 0, but AB = 0.
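Perlis's example can be checked directly (a numpy sketch):

```python
import numpy as np

A = np.array([[-2,  1],
              [ 2, -1]])
B = np.array([[2, 3],
              [4, 6]])

print(A @ B)  # the 2x2 zero matrix, even though A != 0 and B != 0
```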