# Thread: using Mean Value Theorem to solve an inequality

1. ## using Mean Value Theorem to solve an inequality

I need to prove the following inequality using the MVT; I'm not sure how to go about it:
$a^{t}b^{1-t}\leq ta+(1-t)b$ for $t\in [0,1], a>0,b>0$
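(As a quick sanity check before proving anything, the claimed inequality — the weighted AM–GM bound — can be verified numerically; the sample values below are arbitrary, not from the thread.)

```python
# Sanity check of a^t * b^(1-t) <= t*a + (1-t)*b over a small grid
# of arbitrary positive a, b and t in [0, 1].
values = [0.5, 1.0, 2.0, 7.3]
ts = [0.0, 0.25, 0.5, 0.9, 1.0]
for a in values:
    for b in values:
        for t in ts:
            lhs = a**t * b**(1 - t)
            rhs = t * a + (1 - t) * b
            # small tolerance guards against floating-point rounding
            assert lhs <= rhs + 1e-12, (a, b, t)
print("inequality holds on all samples")
```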

2. Originally Posted by dannyboycurtis
I need to prove the following inequality using the MVT; I'm not sure how to go about it:
$a^{t}b^{1-t}\leq ta+(1-t)b$ for $t\in [0,1], a>0,b>0$
Where have you gotten stuck?

3. I'm not sure where to start.

4. Originally Posted by dannyboycurtis
I'm not sure where to start.
Most of these problems are solved by finding the correct interval $[a,b]$, noting that $f(b)-f(a)=(b-a)f'(c)$ for some $c\in(a,b)$, and observing that $f'$ is monotone on $(a,b)$, so that $(b-a)f'(a)\le f(b)-f(a)\le (b-a)f'(b)$ when $f'$ is increasing (or the reversed bounds when it is decreasing). Is this one of those?
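(A standard illustration of that sandwich technique, not taken from this thread: estimating $\sqrt{66}$ with $f(x)=\sqrt{x}$ on $[64,66]$.)

```latex
% f(x) = \sqrt{x} on [64, 66]; f'(x) = 1/(2\sqrt{x}) is decreasing,
% so (b-a)f'(b) \le f(b) - f(a) \le (b-a)f'(a):
\[
\frac{2}{2\sqrt{66}} \;\le\; \sqrt{66} - 8 \;\le\; \frac{2}{2\sqrt{64}} = \frac{1}{8},
\]
% which pins \sqrt{66} between 8 + 1/\sqrt{66} and 8.125.
```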

5. I am still thoroughly confused as to how to do this one. All I have is that I should use the function $f(x)=tx-x^t$, but this hasn't really shed any light for me.
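(A sketch of how that hinted function finishes the proof, assuming $t\in(0,1)$ — the endpoints $t=0$ and $t=1$ give equality immediately.)

```latex
% Reduce to one variable: divide both sides by b > 0 and set x = a/b > 0,
% so the claim becomes x^t \le tx + (1-t), i.e. f(x) \ge f(1) where
\[
f(x) = tx - x^t, \qquad f(1) = t - 1, \qquad f'(x) = t\left(1 - x^{t-1}\right).
\]
% By the MVT, f(x) - f(1) = f'(c)(x - 1) for some c strictly between 1 and x.
% Since t - 1 < 0: if x > 1 then c > 1 gives c^{t-1} < 1, so f'(c) > 0 and
% x - 1 > 0; if 0 < x < 1 then c^{t-1} > 1, so f'(c) < 0 and x - 1 < 0.
% Either way the product is nonnegative, so f(x) \ge f(1), i.e.
\[
tx - x^t \ge t - 1
\;\Longrightarrow\;
x^t \le tx + (1 - t)
\;\Longrightarrow\;
a^t b^{1-t} \le ta + (1-t)b .
\]
```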