I'm trying to find an answer to the following problem. Any help will be greatly appreciated.
I have a normally-distributed variable X with a mean of 1000 and some variance (let's assume it's 1).
I choose a range of 970-990 (for the example's sake).
For any X above 990, I receive 20 dollars.
For any X below 970, I receive 0 dollars.
For X between 970 and 990, I receive X - 970 dollars.
My question is: what is the function for calculating the mean profit of this example?
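To make the payoff rule concrete, here is how I would simulate it (just a sketch: the function name, the Monte Carlo approach, and the sample size are my own choices, and the standard deviation is assumed to be 1 as above):

```python
import random
import statistics

def payoff(x, low=970.0, high=990.0, cap=20.0):
    """Payoff rule from the question: 0 dollars below `low`,
    x - low dollars inside [low, high], `cap` dollars above `high`."""
    if x < low:
        return 0.0
    if x > high:
        return cap
    return x - low

# Monte Carlo estimate of the mean profit, assuming sd = 1.
random.seed(0)
draws = [random.gauss(1000.0, 1.0) for _ in range(100_000)]
mean_profit = statistics.mean(payoff(x) for x in draws)
print(mean_profit)  # with mean 1000 and sd 1, X > 990 essentially always, so this is 20.0
```

Note that with a mean of 1000 and a standard deviation of 1, X lands above 990 essentially every time, so the simulated mean profit is simply 20 dollars; the interesting middle range only matters with a larger variance.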
There's one thing I still have a problem with.
Pr(970 < X < 990) is the probability of X falling anywhere within that range.
But the amount of money received depends on where within that range X falls: if X = 985 I receive 15 dollars, while if X = 972 I receive 2 dollars.
I don't see how the function you gave relates to that fact.
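To show numerically what I mean, here is a small simulation of just the middle range's contribution (a sketch: I use a hypothetical sd of 10 here, instead of 1, so that X actually lands in the 970-990 range sometimes):

```python
import random

random.seed(1)
low, high = 970.0, 990.0

# sd = 10 is an assumption for illustration only, so the range gets hit.
draws = [random.gauss(1000.0, 10.0) for _ in range(200_000)]
in_range = [x for x in draws if low < x < high]

p_range = len(in_range) / len(draws)                      # Pr(970 < X < 990)
avg_inside = sum(x - low for x in in_range) / len(in_range)  # average payout when inside

# The middle range's contribution to the mean profit is
# Pr(970 < X < 990) * (average of X - 970 given X is in the range),
# not Pr(...) times any single fixed dollar amount.
print(p_range, avg_inside, p_range * avg_inside)
```

The average payout inside the range is not 10 (the midpoint's payout) here, because X is more likely to fall near 990 than near 970 when the mean is 1000; that is exactly the "where within the range" dependence I am asking about.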