I am having trouble with a homework problem from Calc 3, and I am not sure how to approach it. It is in the section on line integrals, but the fundamental theorem for line integrals has not been introduced yet. This whole chapter has been difficult at times; it's hard to understand what is being done with the integrals.

If C is a smooth curve given by a vector function r(t), a ≤ t ≤ b, and v is a constant vector, show that

∫_C v · dr = v · [r(b) − r(a)]

The only thing I can think of is to write the line integral as ∫_a^b v · r'(t) dt (since v is constant, it doesn't depend on r(t)), which also equals ∫_C v · T ds, but I have no idea how to get it to equal v · [r(b) − r(a)].
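For reference, here is a LaTeX transcription of the work so far (this just restates the definitions above, with v constant; it is not a solution):

```latex
\int_C \mathbf{v} \cdot d\mathbf{r}
  = \int_a^b \mathbf{v} \cdot \mathbf{r}'(t)\, dt
  = \int_C \mathbf{v} \cdot \mathbf{T}\, ds
```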