I seriously have no idea how to approach this, even after staring at it for an hour.

A high-tech company purchases a new computing system whose initial value is V. The system will depreciate at the rate f = f(t), where t is the time measured in months. Suppose that

f(t) = … if 0 < t ≤ 30 (the formula for this piece didn't come through), and f(t) = 0 if t > 30.

Determine the length of time T for the total depreciation to equal the initial value V.
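Here is a sketch of the general setup, leaving the rate function f generic since the formula for the first piece is missing above: the total depreciation accumulated from month 0 to month T is the integral of the depreciation rate, so T should satisfy

```latex
\int_0^T f(t)\,dt = V
```

Substituting the actual piecewise formula for f(t), evaluating the integral over 0 to T (for T at most 30, since f vanishes afterward), and setting the result equal to V gives an equation to solve for T.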

Can somebody please help me get this set up? I'm totally frustrated.

Thanks.