My question relates to the title above: which is better to use when comparing two values?

I have a value X. I then (randomly) produce a value Y. Y can be accepted if it matches X to within 2%.
To decide this, should I be using percent change, i.e. ((final - initial)/initial) * 100, or percent difference, i.e. ((final - initial)/((final + initial)/2)) * 100?

In what situations would one be used as opposed to the other?
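For concreteness, here is a small Python sketch of the two measures as I understand them (the function names are just mine for illustration):

```python
def pct_change(initial, final):
    # Percent change: difference relative to the initial (reference) value.
    # Asymmetric: swapping the arguments gives a different magnitude.
    return (final - initial) / initial * 100

def pct_difference(initial, final):
    # Percent difference: difference relative to the mean of the two values.
    # Symmetric in magnitude: neither value is treated as the reference.
    return (final - initial) / ((final + initial) / 2) * 100

x, y = 100.0, 101.5
print(pct_change(x, y))      # 1.5
print(pct_difference(x, y))  # about 1.489

# My acceptance test, using percent change against X as the reference:
accepted = abs(pct_change(x, y)) <= 2.0
print(accepted)              # True
```

So in my case X is a fixed reference and Y is the candidate, which makes me lean towards percent change, but I am not sure that is right.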
Any opinions would be gratefully received.