Originally Posted by

**Adammarchese** Dear Math Community,

I am struggling with a math problem used to calculate how long I can run a generator at certain power output levels. There are essentially 3 power output values (kW) over 3 periods of time. In the attached graph, we know that the total run time of the generator is 10 hours. We also know that for the first 4 hours we must run the generator at a prime power of 10. The next step is to drop the power output to a continuous rating of 8 (at T1) for some period of time, and then drop to an unknown low power output for the remainder of the 10-hour run. The goal of this procedure is to start high BUT have the average over the full 10-hour run equal the continuous power output (8). I am looking for an equation (or equations) that will let me calculate the low power value and the time (t2) at which we must switch from continuous to low.
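One way to attack this (a sketch, not a definitive answer) is an energy balance: the average power times the total duration must equal the sum of power × time over the three intervals. With the numbers above that gives one equation relating the low power value and the switch time t2, so fixing one determines the other. A minimal Python sketch of that relation, with hypothetical parameter names (`prime`, `cont`, `avg`, etc. are my labels for the values described in the post):

```python
def low_power(t2, total=10.0, avg=8.0, prime=10.0, prime_hours=4.0, cont=8.0):
    """Solve the energy balance for the final (low) power level.

    Target energy over the run:  avg * total
    Energy already committed:    prime * prime_hours  (first interval)
                               + cont * (t2 - prime_hours)  (second interval)
    Whatever remains must be delivered at P_low over (total - t2) hours.
    """
    remaining = total - t2
    if remaining <= 0:
        raise ValueError("switch time t2 must fall before the end of the run")
    return (avg * total - prime * prime_hours - cont * (t2 - prime_hours)) / remaining

# Example: switching from continuous to low at t2 = 8 hours
# forces the last 2 hours down to 4 kW to keep the average at 8.
print(low_power(8.0))
```

Note that a single average constraint cannot pin down both unknowns at once; you would need a second condition (for example, a minimum allowed low-power setting, or a required duration at continuous rating) to fix t2 itself.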

Any thoughts would be appreciated!