A curious question occurred to me, and I am not familiar enough with real analysis to answer it. Consider a real function defined as follows: for each element $x \in [0,1]$, let $f(x)$ be some random element of $[0,1]$ — a function that is obviously discontinuous everywhere. What would the value of $\int_0^1 f(x)\,dx$ be? It would make intuitive sense that the area under the curve equals the length of the interval, $1$, times the average value of the function on that interval, $\frac{1}{2}$. Therefore the answer to the question would be $\frac{1}{2}$. But how can this be proved rigorously? Is the function even integrable, and under which definition of the integral?
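To illustrate the intuition numerically, here is a short sketch (assuming the random values are drawn uniformly from $[0,1]$, which the question does not specify): a Riemann-type sum of such a function is just the average of $n$ independent uniform draws, which by the law of large numbers concentrates near $\frac{1}{2}$ as $n$ grows.

```python
import random

random.seed(0)  # for reproducibility

def riemann_sum(n):
    # Sample f at n points of [0, 1]; each value f(x) is an
    # independent uniform draw from [0, 1], so the Riemann sum
    # is the sample mean of n uniform random variables.
    dx = 1.0 / n
    return sum(random.random() * dx for _ in range(n))

print(riemann_sum(100_000))  # close to 0.5
```

Of course, this does not settle the question of integrability: different sample points give different values of $f$, so the upper and lower Darboux sums are $1$ and $0$ on every partition, and the simulation only shows that "typical" Riemann sums cluster around $\frac{1}{2}$.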