I'm working on a practice midterm and this question has me a bit stumped. Can anyone help me work through it? Thank you!!
f(x) = a e^(-ax) for 0 <= x < infinity,
where a > 0 is a constant. Show that the mean and the standard deviation are both 1/a.
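In case it's useful, here's a quick symbolic sanity check with sympy. It just confirms the answer by evaluating the defining integrals; it isn't the by-hand integration-by-parts derivation an exam would expect, but it's a good way to check your work.

```python
import sympy as sp

a, x = sp.symbols('a x', positive=True)
f = a * sp.exp(-a * x)  # the given density

# Mean: E[X] = integral of x*f(x) over [0, infinity)
mean = sp.integrate(x * f, (x, 0, sp.oo))

# Second moment E[X^2], then Var(X) = E[X^2] - E[X]^2, std = sqrt(Var)
ex2 = sp.integrate(x**2 * f, (x, 0, sp.oo))
std = sp.sqrt(sp.simplify(ex2 - mean**2))

print(mean)  # 1/a
print(std)   # 1/a
```

Both integrals fall to integration by parts by hand: E[X] = 1/a, E[X^2] = 2/a^2, so Var(X) = 2/a^2 - 1/a^2 = 1/a^2 and the standard deviation is 1/a.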