Originally Posted by uberbandgeek6
Suppose that (X, Y) is uniformly distributed on a circle with center at the origin and radius 1. What is P(3X + Y > 1)?
Hint: Sketch the region over which you must integrate. Also, an antiderivative of sqrt(1-y^2) is .5y*sqrt(1-y^2) + .5*arcsin(y).
First I found the pdf of (X, Y) by integrating a constant c over the disk: dx from -sqrt(1-y^2) to sqrt(1-y^2), then dy from -1 to 1, and setting the result equal to 1, which gives c = 1/pi.

Then, for P(3X + Y > 1), I integrated 1/pi with dx from (1-y)/3 to sqrt(1-y^2) and dy from -4/5 to 1. This gave me 8.61, which can't possibly be right for a probability. Did I set this problem up incorrectly? Sorry if this explanation is hard to follow; I can clarify it if needed.
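Not part of the original post, but here is a quick Python sketch (my addition) that evaluates the integral with exactly these bounds, using the hinted antiderivative, and cross-checks it with a Monte Carlo sample over the unit disk. It suggests the setup itself is fine and the probability should come out around 0.30, so the 8.61 is most likely an evaluation slip (for instance, arcsin must be taken in radians, not degrees).

```python
import math
import random

# Hinted antiderivative of sqrt(1 - y^2); math.asin returns radians.
def F(y):
    return 0.5 * y * math.sqrt(1 - y * y) + 0.5 * math.asin(y)

# y-limits come from intersecting 3x + y = 1 with x^2 + y^2 = 1:
# ((1 - y)/3)^2 + y^2 = 1  =>  10y^2 - 2y - 8 = 0  =>  y = -4/5 or y = 1.
lo, hi = -4.0 / 5.0, 1.0

# Area between the circle x = sqrt(1 - y^2) and the line x = (1 - y)/3,
# then divide by the disk's area pi (since the pdf is 1/pi).
curve_part = F(hi) - F(lo)
line_part = (1.0 / 3.0) * ((hi - hi**2 / 2) - (lo - lo**2 / 2))
p_exact = (curve_part - line_part) / math.pi
print(round(p_exact, 4))  # about 0.3021 -- a legitimate probability

# Monte Carlo sanity check: uniform points in the disk by rejection sampling.
random.seed(0)
hits = trials = 0
while trials < 200_000:
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    if x * x + y * y <= 1:
        trials += 1
        hits += (3 * x + y > 1)
print(hits / trials)  # should land near 0.30
```

Both numbers agree, which backs up the bounds (1 - y)/3 to sqrt(1-y^2) for x and -4/5 to 1 for y.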