# Thread: Best way to measure the accuracy of result of function?

1. ## Best way to measure the accuracy of result of function?

What is the best way to measure the accuracy of a function that outputs a set of elements, when the desired set of elements is known?

Suppose,
Actual expected output = {a1, a3}
Experimental output = {a1, a5, a6}

The accuracy measure should account for both the number of correct elements and the number of wrong elements.

2. ## Re: Best way to measure the accuracy of result of function?

What options have you thought about?

Something very simple might be "number of correct elements - number of wrong elements". Higher numbers indicate better performance.

If you want something more sophisticated, I suggest you think about which kinds of errors matter most. For example, if one function gives an answer that is 100 units too big, is that more, less, or equally significant compared with another function that gives 2 wrong answers, each 50 units too big?
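The simple score suggested above can be sketched in a few lines; this is just an illustration, assuming the expected and returned results are plain sets (the function and variable names are mine, not from the thread):

```python
def simple_score(expected, returned):
    """Count of correct elements minus count of wrong elements."""
    correct = len(expected & returned)  # elements that should be there and are
    wrong = len(returned - expected)    # elements returned that should not be there
    return correct - wrong

# The original poster's example: expected {a1, a3}, got {a1, a5, a6}
# -> 1 correct (a1), 2 wrong (a5, a6), score = 1 - 2 = -1
print(simple_score({"a1", "a3"}, {"a1", "a5", "a6"}))
```

Note that this score is unbounded and unnormalized, so it is mainly useful for comparing runs over the same data.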

3. ## Re: Best way to measure the accuracy of result of function?

Originally Posted by SpringFan25
What options have you thought about?

Something very simple might be "number of correct elements - number of wrong elements". Higher numbers indicate better performance.

If you want something more sophisticated, I suggest you think about which kinds of errors matter most. For example, if one function gives an answer that is 100 units too big, is that more, less, or equally significant compared with another function that gives 2 wrong answers, each 50 units too big?
Thank you. I'm also thinking along the same lines. I think the following works:
Error = (no. of correct elements missing + no. of wrong elements that appear in the result) / (total no. of elements).

E.g., suppose the full set of elements is {a1, a2, a3, a4, a5},
correct items = {a1, a4},
wrong items = {a2, a3, a5}.
Then if the function returned {a1, a3, a5}, the error = (1 + 2)/5 = 0.6, i.e. 60%.