I'm not even sure if that's the proper thing to call it, but here's the question.

A point is uniformly distributed within the disk of radius 1. That is, its density is

f(x,y) = C, for x^2 + y^2 ≤ 1

Find the probability that its distance from the origin is less than x, 0≤x≤1.

I'm not sure what I'm supposed to do here. What are my steps?
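One way to sanity-check whatever answer you derive is a quick Monte Carlo simulation. The sketch below (my own helper names, `sample_disk` and `estimate_cdf`, not from the problem) draws uniform points in the unit disk by rejection sampling and estimates the probability that the distance from the origin is below a given bound. If the answer comes from comparing areas, the estimate for a bound of 0.5 should sit near 0.5^2 = 0.25.

```python
import random

def sample_disk():
    # Rejection sampling: draw from the square [-1,1]^2 until
    # the point lands inside the unit disk. Accepted points are
    # uniformly distributed over the disk.
    while True:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1:
            return x, y

def estimate_cdf(r, n=100_000):
    # Fraction of uniform disk points whose distance from the
    # origin is less than r (i.e., x^2 + y^2 < r^2).
    count = 0
    for _ in range(n):
        x, y = sample_disk()
        if x * x + y * y < r * r:
            count += 1
    return count / n

if __name__ == "__main__":
    random.seed(0)
    print(estimate_cdf(0.5))  # should be close to 0.25 if the area argument is right
```

This doesn't replace the derivation, but it will quickly tell you whether a candidate formula is in the right ballpark.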