
Math Help - Eigenvalue problem

#1 dwsmith (MHF Contributor)

    Eigenvalue problem

    \varphi''+\lambda\varphi=0
    \varphi'(0)=0
    \varphi(L)=0

m^2+\lambda=0\Rightarrow m=\pm i\sqrt{\lambda}\Rightarrow \exp(\pm ix\sqrt{\lambda})

\varphi=C_1\cos(x\sqrt{\lambda})+C_2\sin(x\sqrt{\lambda})

\varphi_1(0)=1: \ C_1=1
\varphi_1'(0)=0: \ C_2=0

    \varphi_1=\cos(x\sqrt{\lambda})

\varphi_2(0)=0: \ C_1=0
\displaystyle\varphi_2'(0)=1: \ C_2=\frac{1}{\sqrt{\lambda}}

\displaystyle\varphi_2=\frac{\sin(x\sqrt{\lambda})}{\sqrt{\lambda}}

\displaystyle\varphi(x)=A\cos(x\sqrt{\lambda})+B\frac{\sin(x\sqrt{\lambda})}{\sqrt{\lambda}}

\varphi'(0)=0: \ B=0

\varphi(L)=0: \ A\cos(L\sqrt{\lambda})=0

\displaystyle\lambda_n=\left(\frac{(2n+1)\pi}{2L}\right)^2, \ \ n=0,1,2,\dots

\displaystyle\varphi_n(x)=A_n\cos\left(\frac{(2n+1)\pi x}{2L}\right)

\displaystyle f(x)=\sum_{n=0}^{\infty}A_n\cos\left(\frac{(2n+1)\pi x}{2L}\right)

    Is this correct so far?

#2 dwsmith (MHF Contributor)
    Assuming the above is correct, I am now going to show that all the eigenvalues are real.

    \cos(L\sqrt{\lambda})=0

    Let L\sqrt{\lambda}=a+bi, \ \ a,b\in\mathbb{R}

    \cos(a+bi)=\cos(a)\cos(bi)-\sin(a)\sin(bi)=\cos(a)\cosh(b)-i\sin(a)\sinh(b)=0

\displaystyle\cos(a)\cosh(b)=0, \ \ \cosh(b)>0\Rightarrow\cos(a)=0\Rightarrow a=\frac{\pi}{2}+\pi k, \ \ k\in\mathbb{Z}
-\sin(a)\sinh(b)=0\Rightarrow\sinh(b)=0 \ (b=0) \ \text{ or } \ \sin(a)=0 \ (a=\pi k)

Since a cannot be both an odd multiple of \frac{\pi}{2} and a multiple of \pi, the second alternative is impossible, so b=0, L\sqrt{\lambda}=a\in\mathbb{R}, and \lambda=(a/L)^2\in\mathbb{R}.

Thus, all eigenvalues are real.

#3 Ackbeet (A Plied Mathematician)
Just commenting on the proof of the realness of the eigenvalues: you could save yourself a bit of effort if you could show that your original differential operator is self-adjoint. In this context, that amounts to showing that the operator is of the Sturm-Liouville type.

#4 dwsmith (MHF Contributor)
Quote Originally Posted by Ackbeet
Just commenting on the proof of the realness of the eigenvalues: you could save yourself a bit of effort if you could show that your original differential operator is self-adjoint. In this context, that amounts to showing that the operator is of the Sturm-Liouville type.
I haven't gotten to that point in the book yet. Is post 1 correct, though?

#5 Ackbeet (A Plied Mathematician)
    Post 1 looks correct so far as it goes. You should probably, if you haven't yet, go ahead and show that there are no eigenvalues for either the \lambda<0 or \lambda=0 case.

#6 dwsmith (MHF Contributor)
Quote Originally Posted by Ackbeet
Post 1 looks correct so far as it goes. You should probably, if you haven't yet, go ahead and show that there are no eigenvalues for either the \lambda<0 or \lambda=0 case.
Doesn't the fact that \cos(L\sqrt{\lambda}) can't be zero when its argument is of the form a+bi with b\neq 0 show that \lambda can't be less than 0?

#7 Ackbeet (A Plied Mathematician)
I've pored over your post #2 now, and I think I can finally discern the logic you're employing there. Basically, it comes down to this: b=0, or you get a having to be both an odd-integer multiple of \pi/2 and a multiple of \pi, which can't be. Therefore, b=0. I think your overall logic works, provided that the form of your solution hasn't already assumed that \lambda>0. You have not shown that lambda can't be zero. In order to do that, you have to re-solve the DE with that assumption in mind (the solutions you get for that case are not obtainable with any selection of the integration constants for the lambda-not-zero case), and show that there are no eigenvectors.

I would probably solve the problem this way: break it up into three cases, according to the trichotomy law: \lambda<0,\;\lambda=0,\;\lambda>0. For \lambda<0, let \lambda=-\alpha^{2}. For \lambda=0, do the obvious. And for \lambda>0, let \lambda=\alpha^{2}. In each case, you get a different form of the solution, with which you work to see if it's an allowed case. For \lambda<0, you get exponentials. For \lambda=0, you get straight lines. For \lambda>0, you get sinusoids. Remember that, by definition, eigenvectors cannot be identically zero. That fact, in this case, rules out the \lambda<0 and \lambda=0 cases.
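For instance, here is a quick sketch of the \lambda<0 case with that substitution, writing the general solution in hyperbolic form:

\lambda=-\alpha^{2}, \ \alpha>0: \quad \varphi''-\alpha^{2}\varphi=0\Rightarrow\varphi(x)=C_1\cosh(\alpha x)+C_2\sinh(\alpha x)

\varphi'(0)=0\Rightarrow C_2\alpha=0\Rightarrow C_2=0

\varphi(L)=0\Rightarrow C_1\cosh(\alpha L)=0\Rightarrow C_1=0, \ \text{ since }\cosh(\alpha L)>0

so \varphi\equiv 0 and there are no negative eigenvalues.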

    Make sense?

#8 dwsmith (MHF Contributor)
Quote Originally Posted by Ackbeet
I've pored over your post #2 now, and I think I can finally discern the logic you're employing there. Basically, it comes down to this: b=0, or you get a having to be both an odd-integer multiple of \pi/2 and a multiple of \pi, which can't be. Therefore, b=0. I think your overall logic works, provided that the form of your solution hasn't already assumed that \lambda>0. You have not shown that lambda can't be zero. In order to do that, you have to re-solve the DE with that assumption in mind (the solutions you get for that case are not obtainable with any selection of the integration constants for the lambda-not-zero case), and show that there are no eigenvectors.

I would probably solve the problem this way: break it up into three cases, according to the trichotomy law: \lambda<0,\;\lambda=0,\;\lambda>0. For \lambda<0, let \lambda=-\alpha^{2}. For \lambda=0, do the obvious. And for \lambda>0, let \lambda=\alpha^{2}. In each case, you get a different form of the solution, with which you work to see if it's an allowed case. For \lambda<0, you get exponentials. For \lambda=0, you get straight lines. For \lambda>0, you get sinusoids. Remember that, by definition, eigenvectors cannot be identically zero. That fact, in this case, rules out the \lambda<0 and \lambda=0 cases.

    Make sense?
    For lambda = 0, wouldn't it be easier to just show:

    \cos(L\sqrt{\lambda})=0\Rightarrow 1=0 which is never true.

When lambda < 0, the term inside the cosine is complex, and I have already shown that complex numbers can't be eigenvalues.

#9 Ackbeet (A Plied Mathematician)
Quote Originally Posted by dwsmith
    For lambda = 0, wouldn't it be easier to just show:

    \cos(L\sqrt{\lambda})=0\Rightarrow 1=0 which is never true.
    It is not only not easier to show it this way, it is impossible! In even writing down the \cos function at all, you've already assumed that that is the form of the solution when \lambda=0, which simply isn't true. Instead, you must re-solve the DE from scratch (it's quite straight-forward, really), and then apply the boundary conditions. There is no other way that I know of to show that \lambda=0 is not an eigenvalue.

Quote Originally Posted by dwsmith
When lambda < 0, the term inside the cosine is complex, and I have already shown that complex numbers can't be eigenvalues.
    Like I said in my previous post, as long as you haven't already assumed a form of the solution that is only applicable when \lambda>0, then your proof works out fine.

    I would, incidentally, put more English in your proof of post # 2. It's a bit hard to follow what you're doing. Don't write so that you can be understood! Write so that you can't be misunderstood.

#10 dwsmith (MHF Contributor)
Quote Originally Posted by Ackbeet
    It is not only not easier to show it this way, it is impossible! In even writing down the \cos function at all, you've already assumed that that is the form of the solution when \lambda=0, which simply isn't true. Instead, you must re-solve the DE from scratch (it's quite straight-forward, really), and then apply the boundary conditions. There is no other way that I know of to show that \lambda=0 is not an eigenvalue.



    Like I said in my previous post, as long as you haven't already assumed a form of the solution that is only applicable when \lambda>0, then your proof works out fine.

    I would, incidentally, put more English in your proof of post # 2. It's a bit hard to follow what you're doing. Don't write so that you can be understood! Write so that you can't be misunderstood.
From my understanding, plugging in lambda = 0 is fine.

    The example in my book has:

\displaystyle \varphi(L)=\frac{\sin(L\sqrt{\lambda})}{\sqrt{\lambda}}

    \displaystyle \frac{\sin(L\sqrt{\lambda})}{\sqrt{\lambda}}=0

Then it states: We return now to the problem of finding all the eigenvalues, that is, all the solutions of the equation. If lambda = 0 the left member is to be interpreted as

\displaystyle\lim_{\lambda\to 0}\frac{\sin(L\sqrt{\lambda})}{\sqrt{\lambda}}=L\neq 0
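(For what it's worth, that limit just comes from the series for sine:)

\displaystyle\frac{\sin(L\sqrt{\lambda})}{\sqrt{\lambda}}=L-\frac{L^3\lambda}{6}+\cdots\to L \ \text{ as } \ \lambda\to 0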

I used the \varphi(L) that was obtained as well, so I don't see what the difference is, other than that the example is different.

#11 Ackbeet (A Plied Mathematician)
    Well, that method could well be valid: I don't know. To me it seems a bit strange to write down the sin or cosine, which isn't the solution to the \lambda=0 case, and then turn around and use that form of the solution to show that \lambda=0 is not an eigenvalue.

Here's what I would do: \lambda=0 implies \varphi''=0, and so \varphi(x)=mx+b. The \varphi'(0)=0 condition implies m=0, and thus \varphi(x)=b. But \varphi(L)=0 implies b=0, and hence \varphi(x)=0, which is not allowed, because eigenfunctions can't be identically zero by definition. Done. Is that not fairly intuitive?
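In symbols, the \lambda=0 case is just:

\lambda=0:\quad\varphi''=0\Rightarrow\varphi(x)=mx+b,\qquad\varphi'(0)=0\Rightarrow m=0,\qquad\varphi(L)=0\Rightarrow b=0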

#12 dwsmith (MHF Contributor)
    Now, I am asked to show that eigenfunctions are orthogonal.

    \displaystyle\int_0^L\varphi_n(x)\varphi_m(x) \ dx=0, \ \ m\neq n

    \displaystyle\int_0^L(\varphi_n(x))^2 \ dx>0

    \displaystyle\int_0^L\cos\left(\frac{(2n+1)\pi x}{2L}\right)\cdot\cos\left(\frac{(2m+1)\pi x}{2L}\right) \ dx=0, \ \forall m\neq n

    \displaystyle \int_0^L\left[\cos\left(\frac{(2n+1)\pi x}{2L}\right)\right]^2 \ dx=\frac{L}{2}

    To save space, I haven't shown the steps for orthogonality but it does hold.
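Briefly, the m\neq n case comes down to the product-to-sum identity \cos A\cos B=\frac{1}{2}\left[\cos(A-B)+\cos(A+B)\right]:

\displaystyle\int_0^L\cos\left(\frac{(2n+1)\pi x}{2L}\right)\cos\left(\frac{(2m+1)\pi x}{2L}\right)dx=\frac{1}{2}\int_0^L\left[\cos\left(\frac{(n-m)\pi x}{L}\right)+\cos\left(\frac{(n+m+1)\pi x}{L}\right)\right]dx=0

since, for m\neq n, each term on the right integrates to a multiple of \sin(j\pi)=0 with j a nonzero integer.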

    Now, I am supposed to use everything in this thread to solve:

\text{D.E.:}\ \ u_t=ku_{xx}, \ \ \ t>0, \ \ \ 0<x<L
\displaystyle\text{B.C.:}\ \ \begin{cases} u_x(0,t)=0\\u(L,t)=0\end{cases}, \ \ \ t>0
\text{I.C.:}\ \ u(x,0)=L-x, \ \ \ 0<x<L

    Not sure where to begin.

#13 TheEmptySet
Quote Originally Posted by dwsmith
    Now, I am asked to show that eigenfunctions are orthogonal.

    \displaystyle\int_0^L\varphi_n(x)\varphi_m(x) \ dx=0, \ \ m\neq n

    \displaystyle\int_0^L(\varphi_n(x))^2 \ dx>0

    \displaystyle\int_0^L\cos\left(\frac{(2n+1)\pi x}{2L}\right)\cdot\cos\left(\frac{(2m+1)\pi x}{2L}\right) \ dx=0, \ \forall m\neq n

    \displaystyle \int_0^L\left[\cos\left(\frac{(2n+1)\pi x}{2L}\right)\right]^2 \ dx=\frac{L}{2}

    To save space, I haven't shown the steps for orthogonality but it does hold.

    Now, I am supposed to use everything in this thread to solve:

\text{D.E.:}\ \ u_t=ku_{xx}, \ \ \ t>0, \ \ \ 0<x<L
\displaystyle\text{B.C.:}\ \ \begin{cases} u_x(0,t)=0\\u(L,t)=0\end{cases}, \ \ \ t>0
\text{I.C.:}\ \ u(x,0)=L-x, \ \ \ 0<x<L

    Not sure where to begin.
    Now you need to separate the PDE

Assume u(x,t)=T(t)X(x); this gives

    u_{t}=\dot{T}X \text{ and } u_{xx}=TX''

    This gives

    \displaystyle \dot{T}X=TX'' \iff \frac{\dot{T}}{T}=\frac{X''}{X}=-\lambda

    Now the X equation is what you have already solved

X''+\lambda X=0

    Now solve for T(t)=e^{-\lambda t} and you will have the general form of the solution to your equation.
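That is, for each eigenvalue \lambda_n the time equation gives (up to a constant)

\displaystyle\frac{\dot{T}}{T}=-\lambda_n\Rightarrow T_n(t)=e^{-\lambda_n t}

so, superposing, the general form of the solution should look like

\displaystyle u(x,t)=\sum_{n=0}^{\infty}a_ne^{-\lambda_n t}\varphi_n(x)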

Using your initial condition gives this

\displaystyle u(x,0)=\sum_{n=0}^{\infty}a_ne^{-\lambda_n(0)}\varphi_n(x)=L-x

Now use your inner product and the orthogonality relationships to solve for the a_n

\varphi_m(x)u(x,0)=\varphi_{m}(x)(L-x)=\sum_{n=0}^{\infty}a_n\varphi_m(x)\varphi_n(x)

Now integrate both sides from 0 to L and see what happens!
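By orthogonality, only the n=m term survives, so you should end up with something like

\displaystyle\int_0^L(L-x)\varphi_m(x) \ dx=\sum_{n=0}^{\infty}a_n\int_0^L\varphi_m(x)\varphi_n(x) \ dx=a_m\int_0^L\varphi_m^2(x) \ dx=\frac{a_mL}{2}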

#14 dwsmith (MHF Contributor)
Quote Originally Posted by TheEmptySet
    Now you need to separate the PDE

Assume u(x,t)=T(t)X(x); this gives

    u_{t}=\dot{T}X \text{ and } u_{xx}=TX''

    This gives

    \displaystyle \dot{T}X=TX'' \iff \frac{\dot{T}}{T}=\frac{X''}{X}=-\lambda

    Now the X equation is what you have already solved

X''+\lambda X=0

Now solve for T(t)=e^{-\lambda t} and you will have the general form of the solution to your equation.

Using your initial condition gives this

\displaystyle u(x,0)=\sum_{n=0}^{\infty}a_ne^{-\lambda_n(0)}\varphi_n(x)=L-x

Now use your inner product and the orthogonality relationships to solve for the a_n

\varphi_m(x)u(x,0)=\varphi_{m}(x)(L-x)=\sum_{n=0}^{\infty}a_n\varphi_m(x)\varphi_n(x)

Now integrate both sides from 0 to L and see what happens!
    What about the k?

\displaystyle u_t=ku_{xx}\Rightarrow\frac{\dot{T}}{T}=k\frac{X''}{X}=-\lambda\text{?}

#15 TheEmptySet
Quote Originally Posted by dwsmith
    What about the k?

\displaystyle u_t=ku_{xx}\Rightarrow\frac{\dot{T}}{T}=k\frac{X''}{X}=-\lambda\text{?}
I didn't see your k.

I would separate it like this:

\displaystyle \frac{\dot{T}}{kT}=\frac{X''}{X}=-\lambda

This won't change the X equation, and

T(t)=e^{-k\lambda t}, so just keep going.
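For reference, if the integration above goes through, the coefficients and the solution should come out to something like

\displaystyle a_n=\frac{2}{L}\int_0^L(L-x)\cos\left(\frac{(2n+1)\pi x}{2L}\right)dx=\frac{2}{L}\cdot\frac{4L^2}{(2n+1)^2\pi^2}=\frac{8L}{(2n+1)^2\pi^2}

\displaystyle u(x,t)=\sum_{n=0}^{\infty}\frac{8L}{(2n+1)^2\pi^2}\,e^{-k\left(\frac{(2n+1)\pi}{2L}\right)^2t}\cos\left(\frac{(2n+1)\pi x}{2L}\right)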
