
Math Help - Every nonzero vector space can be viewed as a space of functions

  1. #1
    Junior Member
    Joined
    Jun 2010
    Posts
    26

    Every nonzero vector space can be viewed as a space of functions

    The problem statement, all variables and given/known data

    Let V be a nonzero vector space over a field F, and suppose that S is a basis for V. Let C(S,F) denote the vector space of all functions f ∈ Ω(S,F) (i.e. the set of functions from S to the field F) such that f(s) = 0 for all but a finite number of vectors s in S. Let Ψ: C(S,F) → V be defined by Ψ(f) = 0 if f is the zero function, and Ψ(f) = Σ_{s ∈ S : f(s) ≠ 0} f(s)s otherwise. Prove that Ψ is an isomorphism.

    The attempt at a solution

    Okay, I've already proved that Ψ is linear and that it is 1-1, but I'm having trouble proving that it is onto. Here's what I've done:

    I'd like to show that for any v ∈ V there is an f ∈ C(S,F) such that Ψ(f) = v. So v = (a1)s1 + (a2)s2 + ..., where S = {s1, s2, ...} is a basis for V (I haven't been told whether V is finite or infinite dimensional). However, Ψ(f), when f is nonzero, is a linear combination of a FINITE number of elements of the basis. I do realize we could write Ψ(f) = f(s1)s1 + f(s2)s2 + ..., where some of these terms will be zero, since f is nonzero at only a finite number of them. So if v = (a1)s1 + (a2)s2 + ... + (an)sn, then I could define f by f(si) = ai and I would be done. But the problem is that I can't see, and can't show, that Ψ(f) could ever equal a vector v ∈ V which is a linear combination of an infinite number of elements of the basis.

    Any help? This isn't homework, I'm taking a look at linear algebra on my own this summer. Thanks a lot!
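    To make the preimage construction above (define f(si) = ai) concrete, here is a minimal toy sketch of my own, not from any text: the finite-dimensional case V = R^3 with the standard basis, where a finitely supported f ∈ C(S,F) is modeled as a dict from basis indices to nonzero scalars.

    ```python
    # Toy model of Psi for V = R^3 with the standard basis S = {s0, s1, s2}.
    # A finitely supported f in C(S, F) is a dict: basis index -> nonzero scalar.
    S = {0: (1.0, 0.0, 0.0), 1: (0.0, 1.0, 0.0), 2: (0.0, 0.0, 1.0)}

    def psi(f):
        """Psi(f) = sum over s with f(s) != 0 of f(s) * s; the empty sum is 0."""
        v = [0.0, 0.0, 0.0]
        for i, coeff in f.items():
            if coeff != 0:
                for k in range(3):
                    v[k] += coeff * S[i][k]
        return tuple(v)

    # Surjectivity in the finite-dimensional case: given v = (a1, a2, a3),
    # a preimage is simply f(s_i) = a_i on the nonzero coordinates.
    v = (2.0, -1.0, 5.0)
    f = {i: v[i] for i in range(3) if v[i] != 0}
    assert psi(f) == v
    ```

    The question in the post is exactly whether this recipe survives when S is infinite, i.e. whether a single v could ever need infinitely many nonzero f(si).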

  2. #2
    Banned
    Joined
    Oct 2009
    Posts
    4,261
    Thanks
    2
    Quote Originally Posted by buri View Post
    [...] But the problem is that I can't see, and can't show, that Ψ(f) could ever equal a vector v ∈ V which is a linear combination of an infinite number of elements of the basis.


    There is no such thing as a "linear combination of an infinite number of elements of the basis": by definition, if S is a basis of V and S is infinite, then any element of V is a (unique) linear combination of a FINITE number of elements of S.
    This, I think, solves the problem.

    Tonio

  3. #3
    Junior Member
    Joined
    Jun 2010
    Posts
    26
    You're right. I'm following an outline for a course I'll be taking in the fall, and it skips the section on "Maximal Linearly Independent Subsets", which asks the reader to prove exactly what you've just stated. So I hadn't seen it; I guess this problem was connected to that section. Thanks a lot for the help!!

  4. #4
    Junior Member
    Joined
    Jun 2010
    Posts
    26
    Quote Originally Posted by tonio View Post
    by definition, if S is a basis of V and S is infinite, then any element of V is a (unique) lin. comb. of a FINITE number of elements of S.
    Tonio
    Sorry I'm coming back to this, but I don't see intuitively how this can be true. For example, let's say I take the vector space of infinite tuples, so x = (x1, x2, ...). How is it that I can write this as a linear combination of a FINITE number of elements of S (a basis of this vector space)? It just seems that I'd require an infinite number of elements of S to do so. Can you explain this to me?

  5. #5
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    4
    Awards
    2
    In infinite-dimensional vector spaces, there can be functions in the space which require an infinite sum of vectors in the basis. Just think about periodic functions on the interval [0,2\pi], with the Fourier sines and cosines as a basis set. If I take a function as simple as the triangle function, I'm going to need an infinite sum of sines and cosines in order to write the triangle function as a linear combination of sines and cosines. If you consider the continuous functions on the interval in question, this is a vector space.
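    A quick numerical check of the triangle-wave claim (my own sketch, standard library only): the cosine coefficients of the tent function |x - pi| on [0, 2π] are nonzero for every odd harmonic, so no finite combination of cosines reproduces it.

    ```python
    import math

    def tri(x):
        # Triangle ("tent") wave on [0, 2*pi]: tri(x) = |x - pi|
        return abs(x - math.pi)

    def cosine_coeff(n, samples=20000):
        # Riemann-sum approximation of a_n = (1/pi) * integral_0^{2pi} tri(x) cos(n x) dx
        h = 2 * math.pi / samples
        return (1 / math.pi) * sum(tri(k * h) * math.cos(n * k * h) * h
                                   for k in range(samples))

    # Classical result: a_n = 4/(pi*n^2) for odd n, and 0 for even n >= 2,
    # so the expansion needs infinitely many nonzero terms.
    for n in range(1, 8):
        print(n, round(cosine_coeff(n), 4))
    ```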

  6. #6
    MHF Contributor

    Joined
    Apr 2005
    Posts
    15,577
    Thanks
    1418
    Quote Originally Posted by buri View Post
    Sorry I'm coming back to this, but I don't see intuitively how this can be true. For example, let's say I take the vector space of infinite tuples, so x = (x1, x2, ...). How is it that I can write this as a linear combination of a FINITE number of elements of S (a basis of this vector space)? It just seems that I'd require an infinite number of elements of S to do so. Can you explain this to me?
    Look carefully at what "this" refers to above. The way you use "this", it refers to the vector space, not to a single vector. There exists an infinite set of such vectors, a basis, such that any member of the space, that is, any vector, can be written as a linear combination of a finite number of them. That is, as tonio said, part of the definition of "basis".

  7. #7
    MHF Contributor

    Joined
    Apr 2005
    Posts
    15,577
    Thanks
    1418
    Quote Originally Posted by Ackbeet View Post
    In infinite-dimensional vector spaces, there can be functions in the space which require an infinite sum of vectors in the basis. [...]
    Ackbeet, that's a different matter: an orthonormal (Schauder) basis, not a Hamel basis. In any vector space, even an infinite-dimensional one, there must exist a Hamel basis, that is, a collection of vectors such that every vector can be written as a linear combination of a finite number of them.
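    For reference, the two notions of "basis" being contrasted in this thread can be stated side by side (standard definitions, sketched here for clarity):

    ```latex
    \textbf{Hamel (algebraic) basis.} A set $S \subseteq V$ such that every
    $v \in V$ is a \emph{unique, finite} linear combination
    \[
      v = \sum_{i=1}^{n} a_i s_i, \qquad a_i \in F,\ s_i \in S,\ n \in \mathbb{N}.
    \]
    No topology on $V$ is needed; every vector space has one, given the Axiom of Choice.

    \textbf{Schauder (e.g.\ orthonormal) basis.} A sequence $(s_n)$ in a normed
    space such that every $v$ is a \emph{convergent infinite series}
    \[
      v = \sum_{n=1}^{\infty} a_n s_n,
    \]
    as with the Fourier sines and cosines in $L^2[0,2\pi]$.
    ```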

  8. #8
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    4
    Awards
    2
    Ah. I see it's been a little too long since I've studied this stuff. I get tangled up in careful definitions, I suppose. The wiki on Linear Combinations certainly agrees with you.

  9. #9
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    4
    Awards
    2
    I've also been hobnobbing with physicists too long. Here's a quote from Griffiths's Introduction to Quantum Mechanics, 1st Ed., p. 102, which I thought you'd find cute. He's talking about the eigenfunctions of the i d/dx operator and the x (multiplication by x) operator.

    Thus

    \displaystyle{f_{\lambda}(x)=\frac{1}{\sqrt{2\pi}}\,e^{-i\lambda x},\;\text{with}\;\langle f_{\lambda}|f_{\mu}\rangle=\delta(\lambda-\mu),}

    are the "normalized" eigenfunctions of i\hat{D}, and

    g_{\lambda}(x)=\delta(x-\lambda),\;\text{with}\;\langle g_{\lambda}|g_{\mu}\rangle=\delta(\lambda-\mu),

    are the "normalized" eigenfunctions of \hat{x}.
    At this point he has footnote 21:

    We are engaged here in a dangerous stretching of the rules, pioneered by Dirac (who had a kind of inspired confidence that he could get away with it) and disparaged by von Neumann (who was more sensitive to mathematical niceties), in their rival classics... Dirac notation invites us to apply the language and methods of linear algebra to functions that lie in the "almost normalizable" suburbs of Hilbert space. It turns out to be powerful and effective beyond any reasonable expectation.
    Then back to the main text:

    What if we use the "normalized" eigenfunctions of i\hat{D} and \hat{x} as bases for L_{2}?
    Then footnote 22:

    That's right: We're going to use, as bases, sets of functions none of which is actually in the space! They may not be normalizable, but they are complete, and that's all we need.
    So you can see here that some physicists don't give a rip about the distinction between a Hamel basis and a Schauder basis. All they care about is whether they can write a given function as a sum (possibly infinite) of coefficients against a set of "simpler" or "easier to work with" functions, often the eigenfunctions of a Hermitian operator.

    And, if you think about it, in applied mathematics I'm not sure I can think up an example where you'd actually use a bona fide Hamel basis in an infinite-dimensional space. In any example I can think of (which is certainly not all examples), you'd have to go for a Schauder basis.

    Is it true that every vector space has a "basis" as you've described? That would have to be a matter of definition: what about my example? Would you not call it a vector space, then? If you look at the axioms of a vector space, I think my function example would fit the bill.
    Follow Math Help Forum on Facebook and Google+

  10. #10
    Junior Member
    Joined
    Jun 2010
    Posts
    26
    Quote Originally Posted by HallsofIvy View Post
    Look carefully at what "this" refers to above. [...]
    Okay, maybe I'm just getting too wrapped up in all this, since it's by definition of a basis.

    I KNOW that such a basis exists, by the maximal principle, but intuitively I don't see how such a basis exists. How could I possibly write an infinite tuple as a linear combination of a finite number of elements of the basis?

    Is there an infinite-dimensional vector space for which we know a basis explicitly? Maybe seeing an example would help me understand this better.

    EDIT
    I'm considering the vector space of infinite tuples. By the result above, x = (x1, x2, ...) can be written as a finite linear combination of vectors in S (if S is a basis of V). So we'd have something like x = (a1)s1 + (a2)s2 + ... + (an)sn. I guess my confusion arises because I tend to think of s1, s2, ..., sn as the "standard basis" (i.e. (1,0,0,...), (0,1,0,...), and so on), for which I would need an infinite linear combination to get x. BUT this "standard basis" isn't a basis by the definition. So a basis that does work maybe has elements like s1 = (1, 0, 3/4, 2, ...), and other ones with more than one nonzero entry, so that each entry of x will be "taken care of". Maybe like that only a finite number would be needed.

    See what I mean? Seriously, this is freaky stuff lol
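    The intuition in the EDIT above can be checked with a small sketch of my own (not from the thread): modeling sequences lazily as functions, any FINITE combination of the standard unit sequences has finite support, so something like (1, 1, 1, ...) is provably out of reach of the e_i, which is exactly why they fail to be a (Hamel) basis of the full sequence space.

    ```python
    # Model infinite sequences lazily as functions n -> value.
    def e(i):
        """Standard unit sequence e_i: 1 at position i, 0 elsewhere."""
        return lambda n: 1.0 if n == i else 0.0

    def finite_combo(terms):
        """terms: list of (coefficient, basis index) pairs, i.e. a finite combination."""
        return lambda n: sum(c * e(i)(n) for c, i in terms)

    # Any finite combination of the e_i has finite support, so the all-ones
    # sequence (1, 1, 1, ...) can never be produced this way.
    x = finite_combo([(2.0, 0), (3.0, 5)])   # 2*e_0 + 3*e_5
    support = [n for n in range(100) if x(n) != 0]
    print(support)   # [0, 5]
    ```

    A Hamel basis of the full sequence space must therefore contain far stranger vectors than the e_i, and (as noted later in the thread) it is only known to exist via Zorn's Lemma.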
    Last edited by buri; July 23rd 2010 at 07:03 AM. Reason: Another thought I had to put in

  11. #11
    Junior Member
    Joined
    Jun 2010
    Posts
    26
    Quote Originally Posted by Ackbeet View Post
    Is it true that every vector space has a "basis" as you've described? That would have to be a matter of definition: what about my example? Would you not call it a vector space, then? If you look at the axioms of a vector space, I think my function example would fit the bill.
    Yes, EVERY vector space has a basis (like the one described in my original post), but you've described the "wrong" kind of basis (i.e. one that doesn't fit the definition I've given). However, such a basis DOES exist as a consequence of the maximal principle (or equivalently Zorn's Lemma/the Axiom of Choice); we just don't know what it is.

  12. #12
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    4
    Awards
    2
    So what if you don't believe in the Axiom of Choice?

  13. #13
    Junior Member
    Joined
    Jun 2010
    Posts
    26
    Quote Originally Posted by Ackbeet View Post
    So what if you don't believe in the Axiom of Choice?
    Then I wouldn't know what to say...lol

  14. #14
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    4
    Awards
    2
    Well, some mathematicians don't believe in the AoC. My real analysis professor in graduate school didn't.

    I have to admit that I'm not terribly interested in the differences between the kinds of bases right now, though I'm not saying they're unimportant.

    I'm not competent to answer your OP, I'm afraid. If you still have questions, see if someone else can help.

  15. #15
    Junior Member
    Joined
    Jun 2010
    Posts
    26
    Thanks anyway, though! I've learned a lot nonetheless!


