# Thread: Eigenvalue problem

1. **Eigenvalue problem**

$\varphi''+\lambda\varphi=0$
$\varphi'(0)=0$
$\varphi(L)=0$

$m^2+\lambda=0\Rightarrow m=\pm i\sqrt{\lambda}\Rightarrow \varphi=\exp(\pm ix\sqrt{\lambda})$

$\varphi=C_1\cos(x\sqrt{\lambda})+C_2\sin(x\sqrt{\lambda})$

$\varphi_1(0)=1: \ C_1=1$
$\varphi_1'(0)=0: \ C_2=0$

$\varphi_1=\cos(x\sqrt{\lambda})$

$\varphi_2(0)=0: \ C_1=0$
$\displaystyle\varphi_2'(0)=1: \ C_2=\frac{1}{\sqrt{\lambda}}$

$\displaystyle\varphi_2=\frac{\sin(x\sqrt{\lambda})}{\sqrt{\lambda}}$

$\displaystyle\varphi(x)=A\cos(x\sqrt{\lambda})+B\frac{\sin(x\sqrt{\lambda})}{\sqrt{\lambda}}$

$\varphi'(0)=0: \ B=0$

$\varphi(L)=0: \ A\cos(L\sqrt{\lambda})=0$

$\displaystyle\lambda_n=\left(\frac{(2n+1)\pi}{2L}\right)^2, \ \ n=0,1,2,\dots$

$\displaystyle\varphi_n(x)=A_n\cos\left(\frac{(2n+1)\pi x}{2L}\right)$

$\displaystyle f(x)=\sum_{n=0}^{\infty}A_n\cos\left(\frac{(2n+1)\pi x}{2L}\right)$
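As a quick sanity check of the eigenpairs above (a sympy sketch, taking $L=1$; the loop just verifies the ODE and both boundary conditions for the first few $n$):

```python
import sympy as sp

x = sp.symbols('x', positive=True)
L = sp.Integer(1)  # concrete interval length for the check

for n in range(4):
    lam = ((2*n + 1)*sp.pi/(2*L))**2
    phi = sp.cos(sp.sqrt(lam)*x)
    assert sp.simplify(phi.diff(x, 2) + lam*phi) == 0  # phi'' + lambda*phi = 0
    assert phi.diff(x).subs(x, 0) == 0                 # phi'(0) = 0
    assert phi.subs(x, L) == 0                         # phi(L) = 0
```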

Is this correct so far?

2. Assuming the above is correct, I am now going to show that all the eigenvalues are real.

$\cos(L\sqrt{\lambda})=0$

Let $L\sqrt{\lambda}=a+bi, \ \ a,b\in\mathbb{R}$

$\cos(a+bi)=\cos(a)\cos(bi)-\sin(a)\sin(bi)=\cos(a)\cosh(b)-i\sin(a)\sinh(b)=0$

$\displaystyle\cos(a)\cosh(b)=0, \ \ \cosh(b)>0\Rightarrow\cos(a)=0\Rightarrow a=\frac{\pi}{2}+\pi k, \ \ k\in\mathbb{Z}$
$-\sin(a)\sinh(b)=0\Rightarrow\sinh(b)=0\Rightarrow b=0$, or $\sin(a)=0\Rightarrow a=\pi k$, which contradicts $a=\frac{\pi}{2}+\pi k$

Hence $b=0$: the zeros of $\cos$ are all real, so $L\sqrt{\lambda}$, and with it $\lambda$, must be real.

Thus, all eigenvalues are real.
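The identity used above, and the fact that $\cos$ has no zeros off the real axis, can be spot-checked numerically (a small sketch with Python's `cmath`):

```python
import cmath
import math

# cos(a + bi) = cos(a)cosh(b) - i*sin(a)sinh(b)
for a, b in [(0.3, -1.2), (math.pi/2, 0.7), (2.0, 0.0)]:
    lhs = cmath.cos(complex(a, b))
    rhs = complex(math.cos(a)*math.cosh(b), -math.sin(a)*math.sinh(b))
    assert abs(lhs - rhs) < 1e-12

# If cos(a) = 0 then sin(a) = +-1, so the imaginary part -sin(a)sinh(b)
# vanishes only when b = 0: no zeros of cos off the real axis.
a = math.pi/2  # cos(a) = 0
for b in [0.5, 1.0, 2.0]:
    assert abs(cmath.cos(complex(a, b))) > 0.4
```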

3. Just commenting on the proof of the realness of the eigenvalues: you could save yourself a bit of effort if you could show that your original differential operator is self-adjoint. In this context, that amounts to showing that the operator is of the Sturm-Liouville type.

4. Originally Posted by Ackbeet
Just commenting on the proof of the realness of the eigenvalues: you could save yourself a bit of effort if you could show that your original differential operator is self-adjoint. In this context, that amounts to showing that the operator is of the Sturm-Liouville type.
I haven't arrived at that in the book yet. Is post 1 correct though?

5. Post 1 looks correct so far as it goes. You should probably, if you haven't yet, go ahead and show that there are no eigenvalues for either the $\lambda<0$ or $\lambda=0$ case.

6. Originally Posted by Ackbeet
Post 1 looks correct so far as it goes. You should probably, if you haven't yet, go ahead and show that there are no eigenvalues for either the $\lambda<0$ or $\lambda=0$ case.
Doesn't the fact that $\cos(a+bi)$ can't vanish when $b\neq 0$ show that lambda can't be less than 0?

7. I've pored over your post # 2 now, and I think I can finally discern the logic you're employing there. Basically, it comes down to this: $b = 0$, or you get $a$ having to be both an odd-integer multiple of $\pi/2$, and a multiple of $\pi$, which can't be. Therefore, $b=0$. I think your overall logic works, provided that the form of your solution hasn't already assumed that $\lambda>0$. You have not shown that lambda can't be zero. In order to do that, you have to re-solve the DE with that assumption in mind (the solutions you get for that case are not obtainable with any selection of the integration constants for the lambda not zero case), and show that there are no eigenvectors.

I would probably solve the problem this way: break it up into three cases, according to the dichotomy law: $\lambda<0,\;\lambda=0,\;\lambda>0.$ For $\lambda<0,$ let $\lambda=-\alpha^{2}.$ For $\lambda=0,$ do the obvious. And for $\lambda>0,$ let $\lambda=\alpha^{2}.$ In each case, you get a different form of the solution, with which you work to see if it's an allowed case. For $\lambda<0$, you get exponentials. For $\lambda=0,$ you get straight lines. For $\lambda>0,$ you get sinusoids. Remember that, by definition, eigenvectors cannot be identically zero. That fact, in this case, rules out the $\lambda<0$ and $\lambda=0$ cases.

Make sense?
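For what it's worth, the first two cases can be run through mechanically (a sympy sketch, using the boundary conditions $\varphi'(0)=0$, $\varphi(L)=0$ from post 1; in both cases only the trivial solution survives):

```python
import sympy as sp

x, alpha, L = sp.symbols('x alpha L', positive=True)
C1, C2 = sp.symbols('C1 C2')

# Case lambda < 0 (lambda = -alpha**2): hyperbolic/exponential solutions
phi = C1*sp.cosh(alpha*x) + C2*sp.sinh(alpha*x)
sols = sp.solve([phi.diff(x).subs(x, 0),   # phi'(0) = 0
                 phi.subs(x, L)],          # phi(L) = 0
                [C1, C2])
assert sols == {C1: 0, C2: 0}              # only phi = 0: no eigenvalues here

# Case lambda = 0: straight lines
m, b = sp.symbols('m b')
line = m*x + b
sols0 = sp.solve([line.diff(x).subs(x, 0), line.subs(x, L)], [m, b])
assert sols0 == {m: 0, b: 0}               # again only phi = 0
```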

8. Originally Posted by Ackbeet
I've pored over your post # 2 now, and I think I can finally discern the logic you're employing there. Basically, it comes down to this: $b = 0$, or you get $a$ having to be both an odd-integer multiple of $\pi/2$, and a multiple of $\pi$, which can't be. Therefore, $b=0$. I think your overall logic works, provided that the form of your solution hasn't already assumed that $\lambda>0$. You have not shown that lambda can't be zero. In order to do that, you have to re-solve the DE with that assumption in mind (the solutions you get for that case are not obtainable with any selection of the integration constants for the lambda not zero case), and show that there are no eigenvectors.

I would probably solve the problem this way: break it up into three cases, according to the dichotomy law: $\lambda<0,\;\lambda=0,\;\lambda>0.$ For $\lambda<0,$ let $\lambda=-\alpha^{2}.$ For $\lambda=0,$ do the obvious. And for $\lambda>0,$ let $\lambda=\alpha^{2}.$ In each case, you get a different form of the solution, with which you work to see if it's an allowed case. For $\lambda<0$, you get exponentials. For $\lambda=0,$ you get straight lines. For $\lambda>0,$ you get sinusoids. Remember that, by definition, eigenvectors cannot be identically zero. That fact, in this case, rules out the $\lambda<0$ and $\lambda=0$ cases.

Make sense?
For lambda = 0, wouldn't it be easier to just show:

$\cos(L\sqrt{\lambda})=0\Rightarrow 1=0$ which is never true.

When lambda < 0, the argument of the cosine is complex, and I have already shown that $\cos$ has no non-real zeros, so those values of lambda can't be eigenvalues.

9. Originally Posted by dwsmith
For lambda = 0, wouldn't it be easier to just show:

$\cos(L\sqrt{\lambda})=0\Rightarrow 1=0$ which is never true.
It is not only not easier to show it this way, it is impossible! In even writing down the $\cos$ function at all, you've already assumed that that is the form of the solution when $\lambda=0,$ which simply isn't true. Instead, you must re-solve the DE from scratch (it's quite straightforward, really), and then apply the boundary conditions. There is no other way that I know of to show that $\lambda=0$ is not an eigenvalue.

When lambda < 0, the argument of the cosine is complex, and I have already shown that $\cos$ has no non-real zeros, so those values of lambda can't be eigenvalues.
Like I said in my previous post, as long as you haven't already assumed a form of the solution that is only applicable when $\lambda>0,$ then your proof works out fine.

I would, incidentally, put more English in your proof of post # 2. It's a bit hard to follow what you're doing. Don't write so that you can be understood! Write so that you can't be misunderstood.

10. Originally Posted by Ackbeet
It is not only not easier to show it this way, it is impossible! In even writing down the $\cos$ function at all, you've already assumed that that is the form of the solution when $\lambda=0,$ which simply isn't true. Instead, you must re-solve the DE from scratch (it's quite straightforward, really), and then apply the boundary conditions. There is no other way that I know of to show that $\lambda=0$ is not an eigenvalue.

Like I said in my previous post, as long as you haven't already assumed a form of the solution that is only applicable when $\lambda>0,$ then your proof works out fine.

I would, incidentally, put more English in your proof of post # 2. It's a bit hard to follow what you're doing. Don't write so that you can be understood! Write so that you can't be misunderstood.
From my understanding, plugging in lambda = 0 is fine.

The example in my book has:

$\displaystyle \varphi(L)=\frac{\sin(L\sqrt{\lambda})}{\sqrt{\lambda}}$

$\displaystyle \frac{\sin(L\sqrt{\lambda})}{\sqrt{\lambda}}=0$

Then it states: We return now to the problem of finding all the eigenvalues, that is, all the solutions of the equation. If lambda = 0, the left member is to be interpreted as

$\displaystyle\lim_{\lambda\to 0}\frac{\sin(L\sqrt{\lambda})}{\sqrt{\lambda}}=L\neq 0$

I used the expression obtained for $\varphi(L)$ in the same way, so I don't see what the difference is, other than that the example is different.
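The book's limit is easy to confirm directly (a sympy sketch):

```python
import sympy as sp

lam, L = sp.symbols('lambda L', positive=True)
expr = sp.sin(L*sp.sqrt(lam))/sp.sqrt(lam)

# lim_{lambda -> 0+} sin(L*sqrt(lambda))/sqrt(lambda) = L
assert sp.limit(expr, lam, 0, '+') == L
```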

11. Well, that method could well be valid: I don't know. To me it seems a bit strange to write down the sine or cosine, which isn't the solution to the $\lambda=0$ case, and then turn around and use that form of the solution to show that $\lambda=0$ is not an eigenvalue.

Here's what I would do: $\lambda=0$ implies $\varphi''=0,$ and so $\varphi(x)=mx+b.$ The $\varphi'(0)=0$ condition implies $m=0,$ and thus $\varphi(x)=b.$ But $\varphi(L)=0$ implies $b=0,$ and hence $\varphi(x)=0,$ which is not allowed, because eigenfunctions can't be identically zero by definition. Done. Is that not fairly intuitive?

12. Now, I am asked to show that eigenfunctions are orthogonal.

$\displaystyle\int_0^L\varphi_n(x)\varphi_m(x) \ dx=0, \ \ m\neq n$

$\displaystyle\int_0^L(\varphi_n(x))^2 \ dx>0$

$\displaystyle\int_0^L\cos\left(\frac{(2n+1)\pi x}{2L}\right)\cdot\cos\left(\frac{(2m+1)\pi x}{2L}\right) \ dx=0, \ \forall m\neq n$

$\displaystyle \int_0^L\left[\cos\left(\frac{(2n+1)\pi x}{2L}\right)\right]^2 \ dx=\frac{L}{2}$

To save space, I haven't shown the steps for orthogonality but it does hold.
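The omitted steps can be spot-checked symbolically over a few index pairs (a sympy sketch, keeping $L$ symbolic):

```python
import sympy as sp

x, L = sp.symbols('x L', positive=True)

def phi(n):
    # eigenfunctions from post 1 (normalization constant dropped)
    return sp.cos((2*n + 1)*sp.pi*x/(2*L))

for n in range(3):
    for m in range(3):
        I = sp.integrate(phi(n)*phi(m), (x, 0, L))
        expected = L/2 if m == n else 0   # orthogonality / norm
        assert sp.simplify(I - expected) == 0
```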

Now, I am supposed to use everything in this thread to solve:

$\text{D.E.:}\ u_t=ku_{xx}, \ \ \ t>0, \ \ \ 0<x<L$
$\displaystyle\text{B.C.:}\ \begin{cases} u_x(0,t)=0\\u(L,t)=0\end{cases}, \ \ \ t>0$
$u(x,0)=L-x, \ \ \ 0<x<L$

Not sure where to begin.

13. Originally Posted by dwsmith
Now, I am asked to show that eigenfunctions are orthogonal.

$\displaystyle\int_0^L\varphi_n(x)\varphi_m(x) \ dx=0, \ \ m\neq n$

$\displaystyle\int_0^L(\varphi_n(x))^2 \ dx>0$

$\displaystyle\int_0^L\cos\left(\frac{(2n+1)\pi x}{2L}\right)\cdot\cos\left(\frac{(2m+1)\pi x}{2L}\right) \ dx=0, \ \forall m\neq n$

$\displaystyle \int_0^L\left[\cos\left(\frac{(2n+1)\pi x}{2L}\right)\right]^2 \ dx=\frac{L}{2}$

To save space, I haven't shown the steps for orthogonality but it does hold.

Now, I am supposed to use everything in this thread to solve:

$\text{D.E.:}\ u_t=ku_{xx}, \ \ \ t>0, \ \ \ 0<x<L$
$\displaystyle\text{B.C.:}\ \begin{cases} u_x(0,t)=0\\u(L,t)=0\end{cases}, \ \ \ t>0$
$u(x,0)=L-x, \ \ \ 0<x<L$

Not sure where to begin.
Now you need to separate the PDE

Assume $u(x,t)=T(t)X(x)$ this gives

$u_{t}=\dot{T}X \text{ and } u_{xx}=TX''$

This gives

$\displaystyle \dot{T}X=TX'' \iff \frac{\dot{T}}{T}=\frac{X''}{X}=-\lambda$

Now the $X$ equation is what you have already solved

$X''+\lambda X=0$

Now solve for $T(t)=e^{-\lambda t}$ and you will have the general form of the solution to your equation.

Using your initial condition gives this

$\displaystyle u(x,0)=\sum_{n=0}^{\infty}a_ne^{-\lambda_n\cdot 0}\varphi_n(x)=L-x$

Now use your inner product and the orthogonality relationships to solve for the $a_n$

$\varphi_m(x)u(x,0)=\varphi_{m}(x)(L-x)=\sum_{n=0}^{\infty}a_n\varphi_m(x)\varphi_n(x)$

Now integrate both sides from 0 to L and see what happens!
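Carrying those steps out numerically (a sketch with numpy, taking $L=1$; the coefficient formula uses $\int_0^L\varphi_n^2\,dx=L/2$ from the previous post):

```python
import numpy as np

L = 1.0
x = np.linspace(0.0, L, 2001)
f = L - x                                  # initial condition u(x, 0)

def phi(n):
    return np.cos((2*n + 1)*np.pi*x/(2*L))

def trapz(y):                              # trapezoid rule on the grid x
    return float(np.sum((y[1:] + y[:-1])*np.diff(x))/2)

# a_n = (2/L) * integral_0^L (L - x) phi_n(x) dx
N = 100
a = [(2/L)*trapz(f*phi(n)) for n in range(N)]

# the partial sum at t = 0 should reconstruct L - x
u0 = sum(a[n]*phi(n) for n in range(N))
assert np.max(np.abs(u0 - f)) < 5e-3
```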

14. Originally Posted by TheEmptySet
Now you need to separate the PDE

Assume $u(x,t)=T(t)X(x)$ this gives

$u_{t}=\dot{T}X \text{ and } u_{xx}=TX''$

This gives

$\displaystyle \dot{T}X=TX'' \iff \frac{\dot{T}}{T}=\frac{X''}{X}=-\lambda$

Now the $X$ equation is what you have already solved

$X''+\lambda X=0$

Now solve for $T(t)=e^{-\lambda t}$ and you will have the general form of the solution to your equation.

Using your initial condition gives this

$\displaystyle u(x,0)=\sum_{n=0}^{\infty}a_ne^{-\lambda_n\cdot 0}\varphi_n(x)=L-x$

Now use your inner product and the orthogonality relationships to solve for the $a_n$

$\varphi_m(x)u(x,0)=\varphi_{m}(x)(L-x)=\sum_{n=0}^{\infty}a_n\varphi_m(x)\varphi_n(x)$

Now integrate both sides from 0 to L and see what happens!
What about the k?

$\displaystyle u_t=ku_{xx}\Rightarrow\frac{\dot{T}}{T}=k\frac{X''}{X}=-\lambda\text{?}$

15. Originally Posted by dwsmith
What about the k?

$\displaystyle u_t=ku_{xx}\Rightarrow\frac{\dot{T}}{T}=k\frac{X''}{X}=-\lambda\text{?}$
I didn't see your $k$.

I would separate like this

$\displaystyle \frac{\dot{T}}{kT}=\frac{X''}{X}=-\lambda$

This won't change the X equation and

$T(t)=e^{-k\lambda t}$ and just keep going
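Putting the pieces together, the resulting series $u(x,t)=\sum_n a_ne^{-k\lambda_nt}\varphi_n(x)$ can be spot-checked against the PDE and boundary conditions (a numeric sketch, taking $L=k=1$; the check point $x_0=0.4$, $t_0=0.1$ is arbitrary):

```python
import numpy as np

L, k = 1.0, 1.0
N = 50
xs = np.linspace(0.0, L, 2001)

def trapz(y):                               # trapezoid rule on the grid xs
    return float(np.sum((y[1:] + y[:-1])*np.diff(xs))/2)

mu = [(2*n + 1)*np.pi/(2*L) for n in range(N)]           # sqrt(lambda_n)
a = [(2/L)*trapz((L - xs)*np.cos(mu[n]*xs)) for n in range(N)]

def u(x, t):
    return sum(a[n]*np.exp(-k*mu[n]**2*t)*np.cos(mu[n]*x) for n in range(N))

# check u_t = k*u_xx at an interior point by central differences
x0, t0, h = 0.4, 0.1, 1e-4
u_t  = (u(x0, t0 + h) - u(x0, t0 - h))/(2*h)
u_xx = (u(x0 + h, t0) - 2*u(x0, t0) + u(x0 - h, t0))/h**2
assert abs(u_t - k*u_xx) < 1e-4

# boundary conditions: u_x(0, t) = 0 (u is even in x) and u(L, t) = 0
u_x0 = (u(h, t0) - u(-h, t0))/(2*h)
assert abs(u_x0) < 1e-6
assert abs(u(L, t0)) < 1e-12
```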
