Let's say f satisfies:
f''(x) + f'(x)g(x) - f(x) = 0
Show that if f is 0 at 2 points, then f is 0 on some interval between them.
How would you start approaching this problem? I tried some manipulations with f(x) = f''(x) + f'(x)g(x): if f(a) = f''(a) + f'(a)g(a) = 0 and f(b) = f''(b) + f'(b)g(b) = 0, I hoped to extract an interval from those two equations, but I haven't had much luck.
Any help is appreciated, thanks!
That's what I thought at first (and might still think), but according to what is written down it is simply f.
If it helps, I was told by my math major friend to look at this theorem, which is essentially a second-derivative sign test.
Suppose f''(a) exists. If f has a local minimum at a, then f''(a) >= 0 (and, symmetrically, f''(a) <= 0 at a local maximum). The proof: suppose f has a local minimum at a but f''(a) < 0; then by a previous theorem f would have a strict local maximum at a as well, which would necessitate f being constant on some interval containing a, a contradiction.
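If that sign test is the intended tool, here is a sketch of how it might apply to the original problem (this assumes g is defined at every point of the interval, and that the intended conclusion is that f vanishes on all of [a, b]):

```latex
\begin{align*}
&\text{Suppose } f(a) = f(b) = 0 \text{ but } f(c) > 0 \text{ for some } c \in (a, b).\\
&\text{Pick } c \in (a, b) \text{ where } f \text{ attains its positive maximum, so } f'(c) = 0.\\
&\text{Evaluating } f''(x) + f'(x)\,g(x) - f(x) = 0 \text{ at } x = c \text{ gives}\\
&\qquad f''(c) = f(c) - f'(c)\,g(c) = f(c) > 0,\\
&\text{contradicting } f''(c) \le 0 \text{ at a local maximum. The case } f(c) < 0 \text{ is symmetric.}
\end{align*}
```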
What is g(x)?
There must be a condition on g(x)!
For example, g(x) cannot be a function like 2*tan(x) (the sign that makes the equation, as stated, work out).
Because... if it is, then f(x) = sin(x) is a solution, and sin does not satisfy "if f is 0 at 2 points, then f is 0 in some interval between them": it vanishes at 0 and at pi but on no interval in between. (Note that 2*tan(x) is undefined at pi/2, so the issue is g being allowed a singularity between the two zeros.)
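A quick symbolic check of this counterexample (a sketch using sympy; the choice g(x) = 2*tan(x) is the sign that matches the equation f'' + f'g - f = 0 as the OP wrote it):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.sin(x)
g = 2 * sp.tan(x)  # candidate "bad" g; undefined at x = pi/2

# Left-hand side of the ODE f'' + f' * g - f with f = sin(x).
lhs = f.diff(x, 2) + f.diff(x) * g - f

# Simplifies to 0 wherever tan(x) is defined.
print(sp.simplify(lhs))

# sin vanishes at 0 and pi, yet equals 1 at pi/2, so it is not
# identically 0 on any subinterval of (0, pi).
print(f.subs(x, 0), f.subs(x, sp.pi), f.subs(x, sp.pi / 2))
```

So the conclusion of the problem fails here, but only because g blows up strictly between the two zeros of f.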
Bao Q. Ta