In triangle $\displaystyle ABC$, the bisector of angle $\displaystyle ACB$ meets side $\displaystyle AB$ in point $\displaystyle D$, while $\displaystyle BC=a, AC=b$ and $\displaystyle CD=d$. Prove that $\displaystyle d<\frac{2ab}{a+b}$.
A key formula here is the area of a triangle involving two adjacent sides and their included angle:
area = (1/2)·a·b·sin(C) -------------(i)
Given: angle BCD = angle ACD.
Call this common angle theta, written t for short.
area of triangle BCD = A1 = (ad/2)sin(t)
area of triangle ACD = A2 = (bd/2)sin(t)
area of triangle ABC = A1 + A2 = (ab/2)sin(2t), since angle ACB = 2t
So,
(ad/2)sin(t) +(bd/2)sin(t) = (ab/2)sin(2t)
(ad)sin(t) + (bd)sin(t) = (ab)[2sin(t)cos(t)]
ad + bd = 2ab[cos(t)] ....dividing through by sin(t), which is nonzero since 0 < t < pi/2
d(a+b) = 2ab[cos(t)]
d = [2ab / (a+b)]*[cos(t)]
d / cos(t) = 2ab / (a+b) -------------(ii)
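As a numeric sanity check of (ii) and of the claimed bound, here is a short Python sketch. The triangle's side lengths and angle are illustrative values, not from the problem; D is located via the angle bisector theorem (AD/DB = b/a).

```python
import math

# Illustrative triangle: BC = a, AC = b, angle ACB = C (radians).
a, b, C = 3.0, 5.0, 1.2

# Place C at the origin and A on the positive x-axis.
A = (b, 0.0)
B = (a * math.cos(C), a * math.sin(C))

# The bisector from C meets AB at D with AD/DB = AC/BC = b/a,
# so D = (a*A + b*B) / (a + b).
Dx = (a * A[0] + b * B[0]) / (a + b)
Dy = (a * A[1] + b * B[1]) / (a + b)
d = math.hypot(Dx, Dy)          # length CD

rhs = 2 * a * b / (a + b)       # the bound 2ab/(a+b)

# Equation (ii): d = [2ab/(a+b)] * cos(t), with t = C/2.
assert abs(d - rhs * math.cos(C / 2)) < 1e-12
# The inequality being proved: d < 2ab/(a+b).
assert d < rhs
```

Rerunning with other choices of a, b, and 0 < C < pi gives the same outcome, as the proof predicts.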
Now look at (ii), or equivalently d = [2ab/(a+b)]·cos(t).
Since angle ACB = 2t is an interior angle of a triangle, 0 < 2t < pi, so 0 < t < pi/2.
On that interval, 0 < cos(t) < 1: cos(t) would equal 1 only for t = 0, and would reach 0 only at t = pi/2, and neither value of t is possible here.
Therefore d = [2ab/(a+b)]·cos(t) < 2ab/(a+b) always.
Proven.