For two random variables, the "distance" between their distributions is defined to be the maximum $\displaystyle \max_{x}|F_{1}(x)-F_{2}(x)|$, taken over the range on which $\displaystyle F_1$ and $\displaystyle F_2$ are defined, where $\displaystyle F(x)$ denotes the cumulative distribution function. Find the distance between the following two distributions:
$\displaystyle \text{(i) uniform on the interval }[0,1]$
$\displaystyle \text{(ii) pdf is }f(x)=\frac{1}{(x+1)^2}\text{ for }0<x<\infty$
So:
$\displaystyle F_1(x)=x\text{ for }0\le x\le 1$
$\displaystyle F_2(x)=\int_0^x\frac{dt}{(1+t)^2}=1-\frac{1}{1+x}\text{ for }x>0$
$\displaystyle |F_1(x)-F_2(x)|=\left|x-1+\frac{1}{1+x}\right|$
From this point I'm not sure what to do. Any help?
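My tentative continuation, in case it helps (I'm not certain this is right; I split at $\displaystyle x=1$ because $\displaystyle F_1$ is constant beyond that point, so please correct me if this is off):

$\displaystyle \text{On }[0,1]\text{, let }g(x)=x-1+\frac{1}{1+x}.\text{ Then }g'(x)=1-\frac{1}{(1+x)^2}>0\text{ for }x>0,$

so $\displaystyle g$ is increasing there and its maximum on $\displaystyle [0,1]$ is $\displaystyle g(1)=\frac{1}{2}$.

$\displaystyle \text{For }x>1\text{ we have }F_1(x)=1,\text{ so }|F_1(x)-F_2(x)|=\frac{1}{1+x}\le\frac{1}{2},$

which is decreasing in $\displaystyle x$. So the distance would be $\displaystyle \frac{1}{2}$, attained at $\displaystyle x=1$.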