I'm looking for a little help understanding something, apologies if I have posted to the wrong place. I have a standard curve which interpolates a reading to a percentage between 0 and 100.
The curve performs well and we are using it to accurately predict unknown percentages. However, when plotting the data and fitting the curve, if we include the reading for 100%, the lower end of the curve behaves very strangely (see attached files).
Can anyone shed any light on what is happening here? My only thought is that it is being caused by the curve becoming near vertical at the 100% readings.
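One way to test that suspicion is to fit the standards twice, with and without the 100% point, and compare what the two fits predict at the low end. The sketch below is entirely hypothetical: it assumes a four-parameter logistic (4PL) model and uses made-up calibration numbers, since the actual model and data are only in the attachments. The 100% "reading" is deliberately placed above the model's upper asymptote to mimic a near-vertical top end.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    # Hypothetical 4PL: a = low-end response, d = upper asymptote,
    # c = midpoint percentage, b = slope factor.
    return d + (a - d) / (1.0 + (x / c) ** b)

true_params = (0.1, 2.0, 40.0, 3.0)   # made-up "true" curve

# Made-up standards from 1% to 95%, lying exactly on the model.
pct = np.array([1.0, 5.0, 10.0, 20.0, 40.0, 60.0, 80.0, 95.0])
reading = four_pl(pct, *true_params)

# Add a 100% standard whose reading sits well above the curve,
# mimicking the near-vertical behaviour at the top of the range.
pct_all = np.append(pct, 100.0)
reading_all = np.append(reading, four_pl(100.0, *true_params) * 1.5)

p_without, _ = curve_fit(four_pl, pct, reading, p0=true_params, maxfev=10000)
p_with, _ = curve_fit(four_pl, pct_all, reading_all, p0=true_params, maxfev=10000)

# Compare predictions at the low end of the curve (<= 20%).
low = pct[pct <= 20.0]
err_without = np.abs(four_pl(low, *p_without) - four_pl(low, *true_params)).max()
err_with = np.abs(four_pl(low, *p_with) - four_pl(low, *true_params)).max()
print(f"low-end error without 100% point: {err_without:.3g}")
print(f"low-end error with    100% point: {err_with:.3g}")
```

Because least squares spends the whole parameter budget on all points at once, a single point the model cannot reach drags the fitted asymptote and slope, and the damage shows up far away at the low end; printing the two errors makes that visible. If something like this is what's happening, options include dropping the 100% standard from the fit, weighting the fit by reading, or switching to a model with a steeper top end.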