I have a problem in which the rate of change of a variable is normally distributed with mean zero and standard deviation 0.013706.

I have been asked to find the probability of the variable being below a threshold after two instances. That is, if F(1) is the current value and R(1) = (F(1) - F(0))/F(0), with R normally distributed as above, then how do I calculate the probability that (1+R(1))*(1+R(2)) < constant?

If any clarification is needed, please don't hesitate to ask.

Alternatively, I could use the additive form 1 + R(1) + R(2) < constant, but that would be less accurate, so the multiplicative form above is preferred.

Is there some way I can approximate the two instances as one and from there use the conventional single-step approach? Or must I account for each instance separately?
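To make the comparison concrete, here is a small sketch (the threshold 0.98 is purely an illustrative assumption, not from my problem) that estimates the exact multiplicative probability by Monte Carlo and compares it with the additive approximation, where R(1) + R(2) is Normal with mean 0 and standard deviation sigma*sqrt(2):

```python
import math
import random

sigma = 0.013706   # stdev of each rate of change R (from the problem)
c = 0.98           # hypothetical threshold, chosen only for illustration

random.seed(0)
n = 200_000

# Monte Carlo estimate of P((1+R1)(1+R2) < c), the exact multiplicative form
hits = 0
for _ in range(n):
    r1 = random.gauss(0.0, sigma)
    r2 = random.gauss(0.0, sigma)
    if (1 + r1) * (1 + r2) < c:
        hits += 1
p_mc = hits / n

# Additive approximation: R1 + R2 ~ Normal(0, sigma*sqrt(2)), so
# P(1 + R1 + R2 < c) = Phi((c - 1) / (sigma*sqrt(2))),
# where Phi is the standard normal CDF (written here via math.erf)
z = (c - 1) / (sigma * math.sqrt(2))
p_approx = 0.5 * (1 + math.erf(z / math.sqrt(2)))

print(p_mc, p_approx)
```

With sigma this small, the cross term R(1)*R(2) is on the order of sigma^2, so the two estimates come out very close, which suggests the single-step normal approximation may be adequate here.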