In a small metropolitan area, annual losses due to storm, fire, and
theft are independently distributed random variables. The probability density functions are:

f_storm(x) = 0.48*exp(-0.48x), x > 0

f_fire(x) = 0.14*exp(-0.14x), x > 0

f_theft(x) = 0.75*exp(-0.75x), x > 0

Determine the probability that the minimum loss does not exceed 0.5.


My strategy would be to obtain the joint density function by multiplying the three densities together (valid by independence), and then integrate each variable from 0 to 0.5. Thoughts?
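Whatever analytical route is taken, it can be sanity-checked numerically. Below is a minimal Python sketch (assuming the three rates 0.48, 0.14, 0.75 read off the densities above) that estimates P(min ≤ 0.5) by simulation and compares it against the closed form for the minimum of independent exponentials:

```python
import math
import random

random.seed(0)

rates = [0.48, 0.14, 0.75]  # storm, fire, theft
t = 0.5
n = 200_000

# Monte Carlo: draw one loss per peril, check whether the smallest
# of the three falls at or below the threshold t
hits = sum(
    min(random.expovariate(lam) for lam in rates) <= t
    for _ in range(n)
)
mc = hits / n

# Closed form: the minimum of independent exponentials is itself
# exponential, with rate equal to the sum of the individual rates
exact = 1 - math.exp(-sum(rates) * t)

print(mc, exact)  # the two estimates should agree closely
```

Running the simulation with a larger n tightens the agreement; comparing its output to the result of the joint-density integration is a quick way to confirm whether that strategy computes the intended probability.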