Assuming the above is correct, I am now going to show that all the eigenvalues are real.
Let
Thus, all eigenvalues are real.
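In case it's useful to later readers, the standard realness argument, assuming the Dirichlet problem X'' + λX = 0, X(0) = X(L) = 0 (which is my reading of this thread), goes like this:

```latex
% Multiply X'' + \lambda X = 0 by \overline{X} and integrate over [0, L]:
\lambda \int_0^L |X|^2 \, dx
  = -\int_0^L X'' \, \overline{X} \, dx
  = \int_0^L |X'|^2 \, dx - \big[ X' \, \overline{X} \big]_0^L
  = \int_0^L |X'|^2 \, dx,
% where the boundary term vanishes because X(0) = X(L) = 0.  Hence
\lambda = \frac{\int_0^L |X'|^2 \, dx}{\int_0^L |X|^2 \, dx} \in \mathbb{R}.
```

Since both integrals on the right are real (and the denominator is nonzero for an eigenfunction), λ must be real.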
Just commenting on the proof of the realness of the eigenvalues: you could save yourself a bit of effort if you could show that your original differential operator is self-adjoint. In this context, that amounts to showing that the operator is of the Sturm-Liouville type.
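For reference, "Sturm-Liouville type" means the eigenvalue problem can be put in the self-adjoint form

```latex
\frac{d}{dx}\!\left( p(x)\,\frac{dy}{dx} \right) + q(x)\,y + \lambda\, w(x)\,y = 0,
\qquad p(x) > 0,\; w(x) > 0,
```

with separated boundary conditions; self-adjointness then gives you real eigenvalues and orthogonal eigenfunctions essentially for free.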
I've pored over your post #2 now, and I think I can finally discern the logic you're employing there. Basically, it comes down to this: either λ ≠ 0, or you get the sinusoid's argument having to be both an odd-integer multiple of π/2 and a multiple of π, which can't be. Therefore, λ ≠ 0. I think your overall logic works, provided that the form of your solution hasn't already assumed that λ ≠ 0. You have not shown that lambda can't be zero. In order to do that, you have to re-solve the DE with that assumption in mind (the solutions you get for that case are not obtainable with any selection of the integration constants for the lambda-not-zero case), and show that there are no eigenvectors.
I would probably solve the problem this way: break it up into three cases, according to the trichotomy law. For λ < 0, let λ = -μ² with μ > 0. For λ = 0, do the obvious. And for λ > 0, let λ = μ² with μ > 0. In each case, you get a different form of the solution, with which you work to see if it's an allowed case. For λ < 0, you get exponentials. For λ = 0, you get straight lines. For λ > 0, you get sinusoids. Remember that, by definition, eigenvectors cannot be identically zero. That fact, in this case, rules out the λ < 0 and λ = 0 cases.
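If it helps to double-check the three cases, here's a quick SymPy sketch (assuming the spatial equation is X'' + λX = 0, which is my reading of the thread):

```python
import sympy as sp

x = sp.symbols("x")
mu = sp.symbols("mu", positive=True)
X = sp.Function("X")

# Case lambda < 0: write lambda = -mu**2, so X'' - mu**2 X = 0 (exponentials)
sol_neg = sp.dsolve(X(x).diff(x, 2) - mu**2 * X(x), X(x))

# Case lambda = 0: X'' = 0 (straight lines)
sol_zero = sp.dsolve(X(x).diff(x, 2), X(x))

# Case lambda > 0: write lambda = mu**2, so X'' + mu**2 X = 0 (sinusoids)
sol_pos = sp.dsolve(X(x).diff(x, 2) + mu**2 * X(x), X(x))

print(sol_neg)   # exponentials in mu*x
print(sol_zero)  # C1 + C2*x
print(sol_pos)   # sin(mu*x) and cos(mu*x)
```

Applying the boundary conditions to each printed form is then exactly the case analysis described above.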
Make sense?
It is not only not easier to show it this way, it is impossible! In even writing down that sinusoidal function at all, you've already assumed that that is the form of the solution when λ = 0, which simply isn't true. Instead, you must re-solve the DE from scratch (it's quite straightforward, really), and then apply the boundary conditions. There is no other way that I know of to show that λ = 0 is not an eigenvalue.
Like I said in my previous post, as long as you haven't already assumed a form of the solution that is only applicable when λ ≠ 0, then your proof works out fine.

When lambda < 0, the term inside the cosine is complex, and I have already shown that complex numbers aren't eigenvalues of this problem.
I would, incidentally, put more English in your proof of post # 2. It's a bit hard to follow what you're doing. Don't write so that you can be understood! Write so that you can't be misunderstood.
From my understanding, plugging in lambda = 0 is fine.
The example in my book has:
Then it states: We return now to the problem of finding all the eigenvalues, that is, all the solutions of the equation. If lambda = 0, the left member is to be interpreted as
I used what was obtained as well so I don't see what the difference is besides the example is different.
Well, that method could well be valid: I don't know. To me it seems a bit strange to write down the sine or cosine, which isn't the solution to the λ = 0 case, and then turn around and use that form of the solution to show that λ = 0 is not an eigenvalue.
Here's what I would do: λ = 0 implies X'' = 0, and so X = Ax + B. The first boundary condition implies B = 0, and thus X = Ax. But the second boundary condition implies A = 0, and hence X is identically zero, which is not allowed, because eigenfunctions can't be identically zero by definition. Done. Is that not fairly intuitive?
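Here is that argument as a short SymPy check (the Dirichlet conditions X(0) = 0 and X(L) = 0 are my assumption about the problem, just for illustration):

```python
import sympy as sp

x, L, A, B = sp.symbols("x L A B")
X = A * x + B  # general solution of X'' = 0, i.e. the lambda = 0 case

# Impose the (assumed) Dirichlet boundary conditions X(0) = 0 and X(L) = 0:
sol = sp.solve([X.subs(x, 0), X.subs(x, L)], [A, B], dict=True)
print(sol)  # forces A = B = 0, so X is identically zero and 0 is not an eigenvalue
```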
Now you need to separate the PDE. Assume a product solution of the form u(x, t) = X(x)T(t). Substituting this in and dividing through gives two ODEs coupled by the separation constant λ. The spatial equation is the one you have already solved. Now solve the time equation for T(t), and you will have the general form of the solution to your equation.
Using your initial condition gives this
Now use your inner product and the orthogonality relationships to solve for the coefficients.
Now integrate both sides from 0 to L and see what happens!
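To see the "integrate both sides" step in action, here's a SymPy sketch; the eigenfunctions sin(nπx/L) and the sample initial condition f(x) = x are my assumptions, purely for illustration:

```python
import sympy as sp

x, L = sp.symbols("x L", positive=True)
n = sp.symbols("n", positive=True, integer=True)

f = x  # a hypothetical initial condition, just for illustration
phi = sp.sin(n * sp.pi * x / L)  # assumed Dirichlet eigenfunctions

# Multiply f by the n-th eigenfunction and integrate from 0 to L;
# orthogonality kills every other term, leaving a single coefficient:
b_n = sp.integrate(f * phi, (x, 0, L)) / sp.integrate(phi**2, (x, 0, L))
print(sp.simplify(b_n))
```

The printed expression is the n-th coefficient of the eigenfunction expansion; for f(x) = x it comes out proportional to L/n with alternating sign.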