Say $X$ is a uniform random variable on $[-1,1]$. I want to find the density of $Y = g(X)$ over $[g(-1), g(1)] = [-1,1]$. $g$ is a monotonic function, so everything should be kosher. I look up the change-of-variable formula on Wikipedia:

$$f_Y(y) = f_X\!\left(g^{-1}(y)\right)\left|\frac{d}{dy}\,g^{-1}(y)\right|$$

So I do some algebra:

so

thus

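(Concretely, for a monotone example like $g(x) = x^3$ — a stand-in on my part, since the exact $g$ isn't reproduced above — the algebra would be:

$$g^{-1}(y) = y^{1/3}, \qquad \frac{d}{dy}\,y^{1/3} = \frac{1}{3}\,|y|^{-2/3},$$

so, with $f_X \equiv \tfrac{1}{2}$ on $[-1,1]$,

$$f_Y(y) = f_X\!\left(y^{1/3}\right)\left|\frac{d}{dy}\,y^{1/3}\right| = \frac{1}{2}\cdot\frac{1}{3\,|y|^{2/3}} = \frac{1}{6\,|y|^{2/3}}, \qquad y \in [-1,1],$$

which still integrates to $1$ but blows up as $y \to 0$.)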
But there's a singularity/asymptote at $y=0$! Am I doing something wrong? The books I have conveniently choose random variables so there is no divide-by-zero (for example, if $X$ were on $[1,2]$, or some other interval where $g(x)$ doesn't cross the $x$-axis, everything works fine).
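As a sanity check, I simulated it — here assuming $g(x) = x^3$ as a stand-in, since the point is only that $g'$ vanishes where $g$ crosses zero. The histogram of the simulated $Y$ matches the divergent density $f_Y(y) = \frac{1}{6|y|^{2/3}}$ away from $0$, and the spike near $y=0$ is an integrable singularity, so the total mass is still $1$:

```python
import numpy as np

# Assumption: g(x) = x**3 -- the post doesn't pin down g; this is a standard
# monotone choice with g(-1) = -1, g(1) = 1 and g'(0) = 0.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=1_000_000)
y = x ** 3

def f_Y(y):
    # Change-of-variable result: f_X(y^(1/3)) * |d/dy y^(1/3)|
    #   = (1/2) * (1/3) * |y|^(-2/3)
    return 1.0 / (6.0 * np.abs(y) ** (2.0 / 3.0))

# Empirical density of the simulated Y (density=True normalizes the
# histogram so it integrates to 1 over [-1, 1]).
counts, edges = np.histogram(y, bins=200, range=(-1.0, 1.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Away from y = 0 the histogram agrees with the divergent formula.
mask = np.abs(centers) > 0.1
err = np.max(np.abs(counts[mask] - f_Y(centers[mask])))
print("max error away from 0:", float(err))

# The bins nearest y = 0 are huge (the asymptote), yet the density
# still integrates to 1 -- the singularity is integrable.
print("density just above 0:", float(counts[100]),
      "vs at y = 0.5:", float(f_Y(0.5)))
```

So the divide-by-zero in the formula corresponds to a real (but integrable) blow-up of the density, not to an algebra mistake — which is presumably why the textbooks pick intervals like $[1,2]$ that dodge the zero of $g'$.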