One-dimensional heat loss equation (boundary-value problem)

Consider the boundary-value problem

-u'' = f on (x_l, x_r)

u' = A_l at x = x_l

-u' = A_r at x = x_r

which models a steady, one-dimensional temperature distribution, with u(x) the temperature at x. The function f(x) gives the density of heat sources, and A_l and A_r are the heat fluxes leaving the region (x_l, x_r) at the boundaries x = x_l and x = x_r, respectively.
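(As a side note on the model, integrating the equation over the interval gives an energy-balance condition the data must satisfy; this is a sketch under the stated boundary conditions:

```latex
\int_{x_l}^{x_r} f(x)\,dx
  = -\int_{x_l}^{x_r} u''(x)\,dx
  = u'(x_l) - u'(x_r)
  = A_l + A_r
```

i.e., the total heat generated inside (x_l, x_r) equals the total heat flux leaving through the two endpoints.)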

Show that if u_1 and u_2 are solutions, then u_1 - u_2 must be constant.

I can see why this must be true, but can anyone give me some ideas about how to prove it?
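One standard approach (a sketch, not the only route): set w = u_1 - u_2 and subtract the two problems, so w solves the homogeneous version:

```latex
-w'' = 0 \ \text{on } (x_l, x_r), \qquad
w'(x_l) = A_l - A_l = 0, \qquad
-w'(x_r) = A_r - A_r = 0.
```

From w'' = 0, w'(x) is a constant c on (x_l, x_r); the boundary condition w'(x_l) = 0 forces c = 0, so w' vanishes identically and w = u_1 - u_2 is constant. (An energy-method variant also works: integrate -w''w over the interval and use the boundary terms to conclude ∫ (w')^2 dx = 0.)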