Time:

T: timeframe (e.g. 1 year)

t = T/n: time increment (e.g. with T = 1 year and n = 12, each increment is one month)

Starting Value: A(0)

- Value as of now (t(0))

Intermediate Value: A(t(k))

- Value at any point between start and end (0 < k < n)

Ending Value: A(T)

- Value at the ending point (t(n) = T)

Threshold Value: X

- Known, A(0) < X

Random Variable: R

- R > 0, behaving according to some known distribution (if R were always below 1, A could only shrink and would never reach X)

Variables’ Relationship:

A(k+1) = A(k)*R(k), for k = 0, 1, …, n-1
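For concreteness, here is a minimal sketch of one sample path of this recurrence. The Uniform(0.9, 1.1) choice for R is purely a hypothetical stand-in for whatever known distribution R actually follows, and the function names are my own:

```python
import random

def simulate_path(a0, n, draw_r):
    # One sample path of the recurrence A(k+1) = A(k) * R(k), k = 0..n-1.
    # draw_r() samples R; the distribution used below is hypothetical --
    # substitute the known distribution of R.
    path = [a0]
    for _ in range(n):
        path.append(path[-1] * draw_r())
    return path

# Hypothetical example: T = 1 year, n = 12 monthly increments,
# A(0) = 100, R ~ Uniform(0.9, 1.1).
random.seed(0)
path = simulate_path(100.0, 12, lambda: random.uniform(0.9, 1.1))
# path has 13 entries: the starting value plus one value per increment.
# Taking logs turns the product into a sum:
#   log A(k) = log A(0) + sum of log R(j) for j < k,
# so A behaves like a random walk in log-space.
```

The log-space view is worth keeping in mind, since it turns the multiplicative process into an additive one.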

Question: How can I compute P(X < A) during the timeframe T?

If A never exceeds X, then the probability is zero, and,

If A exceeds X halfway through the timeframe (at T/2) and stays above it, then the probability should be about 50% (I think that is approximately correct)

I don’t want to generate discrete points and integrate over the region where A exceeds X, nor do I want to use a Monte Carlo method that runs many trials and averages the results: both are too computationally expensive, since the calculation has to be repeated many times. A solution that is accurate to within a reasonable tolerance (e.g. 5% error) and computationally inexpensive would be ideal.
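To make the baseline I’m trying to avoid concrete, here is a sketch of the Monte Carlo approach. It estimates the probability that A exceeds X at some step within the timeframe (one reading of P(X < A); adjust the event if you mean something different). The distribution for R and all parameter values are hypothetical:

```python
import random

def mc_crossing_prob(a0, x, n, draw_r, trials=10_000):
    # Brute-force Monte Carlo estimate of P(A exceeds X at some step
    # within n steps) -- the expensive approach described above.
    # draw_r() samples the (hypothetical) distribution of R.
    hits = 0
    for _ in range(trials):
        a = a0
        for _ in range(n):
            a *= draw_r()
            if a > x:
                hits += 1
                break
    return hits / trials

# Hypothetical example: A(0) = 100, X = 110, n = 12,
# R ~ Uniform(0.9, 1.15).
random.seed(1)
p = mc_crossing_prob(100.0, 110.0, 12, lambda: random.uniform(0.9, 1.15))
# p lies between 0 and 1; the cost grows with trials * n,
# which is exactly the expense I want to avoid.
```

The cost scales linearly in both the number of trials and the number of increments, which is why repeating this many times is impractical.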

Please let me know if I should clarify something. Any help appreciated!

William