1. ## Parseval inequality question

http://i34.tinypic.com/2u725x3.jpg

I can't see why those three equations are correct.

Can you give a simple example that shows those equations are correct?

What is the Parseval inequality, and what are the roles of those equations in it?

2. Originally Posted by transgalactic
http://i34.tinypic.com/2u725x3.jpg

I can't see why those three equations are correct.

Can you give a simple example that shows those equations are correct?

What is the Parseval inequality, and what are the roles of those equations in it?

I'm 99% sure I know what those equations mean, but after so many questions you've asked here, I think it's about time to be more careful when writing mathematics and, very importantly, when asking questions. Write back and explain EXACTLY what you are working with (a vector space with an inner product, an orthonormal basis, etc.) and what those (vectors?) $u_i$ and $e_i$ are, in particular because you seem to be working with an infinite-dimensional linear space, perhaps as preparation for Fourier series.

Tonio

3. Assuming that you are, in fact, talking about an infinite-dimensional inner product space and that $\{e_i\}$ is an orthonormal basis, those are just extensions of the corresponding notions for finite-dimensional vector spaces.

In particular, if $u= \sum_{i=1}^\infty \alpha_i e_i$, we can determine each of the coefficients, $\alpha_i$, by taking the inner product of $u$ with $e_i$: $\langle u, e_i\rangle = \left\langle \sum_{j=1}^\infty \alpha_j e_j, e_i\right\rangle = \sum_{j=1}^\infty \alpha_j \langle e_j, e_i\rangle$. But $\langle e_j, e_i\rangle = 1$ if $i = j$, $0$ otherwise, so that sum collapses down to the single value $\langle u, e_i\rangle = \alpha_i$. That's where line (2) comes from: since $\alpha_i = \langle u, e_i\rangle$, $u = \sum_{i=1}^\infty \alpha_i e_i = \sum_{i=1}^\infty \langle u, e_i\rangle e_i$.
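As a minimal finite-dimensional sketch of that coefficient-recovery step (the basis here is hypothetical, just the standard basis of $\mathbb{R}^3$ rotated about the $z$-axis, using NumPy):

```python
import numpy as np

# Hypothetical orthonormal basis of R^3: rotate the standard basis by
# 30 degrees about the z-axis. The rows of `e` then satisfy
# <e_i, e_j> = 1 if i == j, else 0.
theta = np.pi / 6
e = np.array([[np.cos(theta),  np.sin(theta), 0.0],
              [-np.sin(theta), np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

u = np.array([2.0, -1.0, 3.0])

# Each coefficient alpha_i is recovered as the inner product <u, e_i>,
# exactly as in the derivation above.
alphas = np.array([np.dot(u, e[i]) for i in range(3)])

# Summing alpha_i * e_i reassembles u, confirming u = sum <u, e_i> e_i.
u_rebuilt = sum(alphas[i] * e[i] for i in range(3))
print(np.allclose(u, u_rebuilt))  # True
```

The same recovery works for any orthonormal basis, because the cross terms $\alpha_j\langle e_j, e_i\rangle$ with $j \neq i$ all vanish.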

The norm, in any inner product space, is defined by $||u|| = \sqrt{\langle u, u\rangle}$, so if $u = \sum_{i=1}^\infty \alpha_i e_i$, then $||u|| = \sqrt{\left\langle \sum_{i=1}^\infty \alpha_i e_i, \sum_{j=1}^\infty \alpha_j e_j\right\rangle} = \sqrt{\sum_{i=1}^\infty \sum_{j=1}^\infty \alpha_i \alpha_j \langle e_i, e_j\rangle}$. But, again, since the $e_i$ are orthonormal, $\langle e_i, e_j\rangle = 1$ if $i = j$, $0$ otherwise, so that sum reduces to $\sqrt{\sum_{i=1}^\infty \alpha_i^2}$, which, as above, is exactly the same as $\sqrt{\sum_{i=1}^\infty |\langle u, e_i\rangle|^2}$.
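A quick numerical check of that norm identity in finite dimensions (the basis and vector here are hypothetical, generated at random; Parseval's identity holds over the full basis, and Bessel's inequality over any partial one):

```python
import numpy as np

# Hypothetical data: QR factorisation of a random 5x5 matrix yields an
# orthogonal Q, so the rows of Q.T form an orthonormal basis of R^5.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
e = Q.T  # rows e[0], ..., e[4] are orthonormal

u = rng.standard_normal(5)
alphas = e @ u  # alpha_i = <u, e_i>

# Parseval: ||u||^2 equals the sum of the squared coefficients
# when the whole orthonormal basis is used.
print(np.isclose(np.dot(u, u), np.sum(alphas ** 2)))  # True

# Bessel's inequality (the "inequality" version, over a partial basis):
# using only the first three basis vectors, the partial sum of squared
# coefficients cannot exceed ||u||^2.
print(np.sum(alphas[:3] ** 2) <= np.dot(u, u) + 1e-12)  # True
```

The second check is the point of the "inequality" form: over an orthonormal set that need not span the whole space, $\sum_i |\langle u, e_i\rangle|^2 \le ||u||^2$, with equality exactly when the set is a complete basis.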

4. Originally Posted by HallsofIvy
Assuming that you are, in fact, talking about an infinite-dimensional inner product space and that $\{e_i\}$ is an orthonormal basis, those are just extensions of the corresponding notions for finite-dimensional vector spaces.

In particular, if $u= \sum_{i=1}^\infty \alpha_i e_i$, we can determine each of the coefficients, $\alpha_i$, by taking the inner product of $u$ with $e_i$: $\langle u, e_i\rangle = \left\langle \sum_{j=1}^\infty \alpha_j e_j, e_i\right\rangle = \sum_{j=1}^\infty \alpha_j \langle e_j, e_i\rangle$. But $\langle e_j, e_i\rangle = 1$ if $i = j$, $0$ otherwise, so that sum collapses down to the single value $\langle u, e_i\rangle = \alpha_i$. That's where line (2) comes from: since $\alpha_i = \langle u, e_i\rangle$, $u = \sum_{i=1}^\infty \alpha_i e_i = \sum_{i=1}^\infty \langle u, e_i\rangle e_i$.

The norm, in any inner product space, is defined by $||u|| = \sqrt{\langle u, u\rangle}$, so if $u = \sum_{i=1}^\infty \alpha_i e_i$, then $||u|| = \sqrt{\left\langle \sum_{i=1}^\infty \alpha_i e_i, \sum_{j=1}^\infty \alpha_j e_j\right\rangle} = \sqrt{\sum_{i=1}^\infty \sum_{j=1}^\infty \alpha_i \alpha_j \langle e_i, e_j\rangle}$. But, again, since the $e_i$ are orthonormal, $\langle e_i, e_j\rangle = 1$ if $i = j$, $0$ otherwise, so that sum reduces to $\sqrt{\sum_{i=1}^\infty \alpha_i^2}$, which, as above, is exactly the same as $\sqrt{\sum_{i=1}^\infty |\langle u, e_i\rangle|^2}$.
I fixed the code; now I'll try to understand each piece.