# Questions about vector spaces, matrices, and eigenvalues

• Apr 19th 2011, 03:51 PM
sj247
Questions about vector spaces, matrices, and eigenvalues
Hiya, I'm a bit stuck and I need help answering these short questions:

- Suppose that (V,+,.) is a vector space over a field F and S = {v1, v2, ..., vk} is a subset of V. Describe the span of S. Explain how to determine whether S is linearly independent.

- Explain what it means for W to be a subspace of (V,+,.)

- What is meant by saying Y: V -> W is an F-linear transformation?

- Define what is meant by saying a matrix is diagonalisable, and describe the connection with the eigenvectors and eigenvalues of the matrix.

- Let V denote a real vector space and B: V x V -> R denote a real bilinear form on V. Describe the two additional properties B must satisfy in order to be an inner product on V.

Thanks, Steve
• Apr 19th 2011, 04:27 PM
Deveno
for any set S of vectors lying in a vector space, the span of S, span(S), is the set of all F-linear combinations of elements of S.

explicitly, this is the set of all sums: a1v1 + a2v2 +...+ akvk, where each aj is in the field F, and each vj is a vector in S.

the test for linear independence of such a set is this: set an arbitrary linear combination of the vj equal to the 0-vector, a1v1 + a2v2 +...+ akvk = 0. the set {v1,v2,...,vk} is linearly independent if and only if this equation forces each and every aj to be 0. in other words, the only linear combination of the vj that equals the 0-vector is the trivial combination 0v1 + 0v2 +...+ 0vk. if we can find a non-zero linear combination that equals the 0-vector, the set {v1,v2,...,vk} is linearly dependent.
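this test can be carried out mechanically: v1,...,vk are independent exactly when the matrix with the vj as rows has rank k. a small sketch in python (exact rational arithmetic, so no floating-point worries; the example vectors are my own):

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix (given as a list of rows) via Gaussian elimination,
    using exact rational arithmetic."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0]) if m else 0):
        # find a pivot in this column at or below row r
        pivot = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                factor = m[i][col] / m[r][col]
                m[i] = [a - factor * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def linearly_independent(vectors):
    # v1..vk are independent iff the matrix with the vj as rows has rank k
    return rank(vectors) == len(vectors)

print(linearly_independent([[1, 0, 0], [0, 1, 0], [1, 1, 0]]))  # False: v3 = v1 + v2
print(linearly_independent([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))  # True
```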

a subset W is a subspace of a vector space V if it satisfies all the vector space axioms, where vector addition and scalar multiplication are the same as for V but restricted to W. since many of the axioms are naturally inherited from V, to verify W is a subspace only 3 conditions need be checked: closure of vector addition on W, closure of scalar multiplication on W, and that W is non-empty (equivalently, that the 0-vector of V resides in W).
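as a concrete illustration (my own example, not from the question): take W = {(x, 2x)} inside R^2 and spot-check the three conditions on sample vectors. of course, a real proof has to cover all vectors and scalars, not just a few samples:

```python
# W = {(x, 2x) : x real} inside R^2 -- a hypothetical candidate subspace
def in_W(v):
    x, y = v
    return y == 2 * x

# spot-check the 3 conditions on sample data (a proof must cover ALL cases)
print(in_W((0, 0)))                        # True: the 0-vector lies in W
u, v = (1, 2), (3, 6)                      # two vectors in W
print(in_W((u[0] + v[0], u[1] + v[1])))    # True: their sum stays in W
c = 5
print(in_W((c * u[0], c * u[1])))          # True: scalar multiples stay in W

# contrast: the shifted line y = 2x + 1 misses the 0-vector,
# so it cannot be a subspace of R^2
print((lambda x, y: y == 2 * x + 1)(0, 0))  # False
```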

to say that Y:V --> W is an F-linear transformation means that for all u,v in V and all a,b in F, Y(au + bv) = aY(u) + bY(v). colloquially: the image of a linear combination of vectors is the same linear combination of the images of the vectors.
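for instance, any map given by a matrix satisfies this condition. a quick numeric spot-check in python, with a matrix i've made up purely for illustration:

```python
def Y(v):
    """Y: R^2 -> R^2 given by the matrix [[2, 1], [0, 3]] (a made-up example)."""
    x, y = v
    return (2 * x + y, 3 * y)

def lin_comb(a, u, b, v):
    """The linear combination au + bv in R^2."""
    return (a * u[0] + b * v[0], a * u[1] + b * v[1])

u, v, a, b = (1, 2), (3, -1), 4, -2
lhs = Y(lin_comb(a, u, b, v))      # image of the linear combination
rhs = lin_comb(a, Y(u), b, Y(v))   # same combination of the images
print(lhs == rhs)  # True
```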

a matrix A is diagonalizable if it is similar to a diagonal matrix; in symbols, PAP^-1 = D for some invertible matrix P. the matrix P can be thought of as a "change of basis" and D represents the matrix for A in the new basis. if A is diagonalizable, a basis of eigenvectors can be chosen, and in this basis the matrix for A is not only diagonal, but the diagonal entries are the eigenvalues of A.
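as a worked example (matrix chosen by hand): A = [[4,1],[2,3]] has eigenvalues 5 and 2, with eigenvectors (1,1) and (1,-2). putting the eigenvectors as the columns of P, the diagonal form appears as P^-1 A P (the inverse of this P plays the role of the P in PAP^-1 above):

```python
from fractions import Fraction as F

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    a, b = M[0]
    c, d = M[1]
    det = F(a * d - b * c)
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[F(4), F(1)], [F(2), F(3)]]   # eigenvalues 5 and 2, worked by hand
P = [[F(1), F(1)], [F(1), F(-2)]]  # columns: eigenvectors for 5 and for 2
D = matmul(matmul(inv2(P), A), P)  # P^-1 A P
print([[int(x) for x in row] for row in D])  # [[5, 0], [0, 2]]
```

the diagonal entries of D are exactly the eigenvalues of A, in the order the eigenvectors were placed into P.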

a (real) inner product must be positive-definite: <u,u> ≥ 0, and <u,u> = 0 if and only if u = 0. it must also be symmetric: <u,v> = <v,u> for all vectors u,v in V. because of this generality, it is possible to define several different inner products on any given vector space V.
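for example, on R^2 every symmetric positive-definite matrix M gives an inner product B(u,v) = u^T M v. here is a spot-check of the two properties with one made-up choice of M:

```python
def B(u, v, M):
    """Bilinear form B(u, v) = u^T M v on R^2."""
    return sum(u[i] * M[i][j] * v[j] for i in range(2) for j in range(2))

M = [[2, 1], [1, 2]]  # symmetric, positive-definite (eigenvalues 1 and 3)
u, v = (1, -2), (3, 5)
print(B(u, v, M) == B(v, u, M))  # True: symmetric
print(B(u, u, M) > 0)            # True: positive on this nonzero u
# the identity matrix M = [[1,0],[0,1]] recovers the standard dot product --
# different choices of M give genuinely different inner products on R^2
```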
• Apr 20th 2011, 03:42 PM
HallsofIvy
These are all questions asking for the basic definitions, not for any calculations or proofs. The very first thing you should do in each chapter or section is learn the definitions in that section. Otherwise, you are going to have a very difficult time in any course, but it is particularly important in upper-level mathematics, where the individual words of definitions are often used in proofs or even in calculations!