signal analysis - step function
I have a problem with the characterization of a step function:
I am analyzing a slow, discrete-time, discrete-valued signal.
The signal corresponds to the conductivity measurement of a water solution over time. The conductivity of the solution is controlled by a slow feedback system: the reference value is changed (let's suppose an increase of one unit), and some actuators raise the solution's conductivity until the new reference value is reached. So I guess the signal could be described as a heavily damped step response.
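A minimal sketch of the kind of signal I mean, assuming a first-order (exponentially damped) step response with additive Gaussian noise; all parameter values (`tau`, `noise_std`, etc.) are illustrative assumptions, not measured ones:

```python
import numpy as np

def simulate_step_response(n=200, x0=1.0, delta=1.0, tau=30.0,
                           noise_std=0.02, seed=0):
    """Simulate the assumed signal model:
    x[k] = x0 + delta * (1 - exp(-k / tau)) + noise,
    i.e. a first-order damped step from x0 toward x0 + delta."""
    rng = np.random.default_rng(seed)
    k = np.arange(n)
    clean = x0 + delta * (1.0 - np.exp(-k / tau))
    return clean + rng.normal(0.0, noise_std, size=n)

signal = simulate_step_response()
```

Because of the noise term, consecutive samples are not guaranteed to increase even while the underlying response is still rising.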
I need to detect when the signal reaches the new value; however, the signal is subject to some noise. I mean that even while it is rising, it is possible to find values that are slightly lower than the previous one.
I have programmed an algorithm that iterates through the samples, comparing each value with the previous one; when the increase over the previous value is not big enough, the algorithm considers that the signal has reached a plateau.
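This is roughly how I understand the algorithm I described (the threshold name and value are assumptions for illustration), plus a small variant that requires several consecutive small increases, since a single noisy sample can trigger a false detection:

```python
def detect_plateau(samples, min_increase=0.005):
    """Naive detector: return the first index where the increase over the
    previous sample falls below `min_increase`, or None if never."""
    for i in range(1, len(samples)):
        if samples[i] - samples[i - 1] < min_increase:
            return i
    return None

def detect_plateau_robust(samples, min_increase=0.005, consecutive=5):
    """Same idea, but only declare a plateau after `consecutive` small
    increases in a row, to tolerate isolated noisy samples."""
    run = 0
    for i in range(1, len(samples)):
        run = run + 1 if samples[i] - samples[i - 1] < min_increase else 0
        if run >= consecutive:
            return i - consecutive + 1
    return None
```

The naive version will stop at the very first noisy dip, which is exactly the reliability problem I am worried about.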
I am not sure whether my approach is reliable enough, or whether I should try to characterize the signal using some mathematical model. If so, could you give me some directions on how to learn to do that?