I was wondering if there is a way to quantify scalability in software engineering.
Let's say you have a process that outputs a file, and I want to see how that process scales when I run it in parallel. I am currently measuring:
1 thread = 60 seconds
2 threads = 60 seconds
4 threads = 63 seconds
Is there a way I can model this? I was thinking of Big-O notation, but are there any statistical tools I could apply to something like this, such as R-squared?
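For context on what such a model could look like: one standard choice is Amdahl's law, T(n) = T1 * (s + (1 - s)/n), where s is the serial fraction of the work. Since this is linear in 1/n, the measurements above can be fitted with ordinary least squares. Below is a minimal sketch in plain Python (the `fit_amdahl` helper is just an illustrative name), using the three timings from the question:

```python
# Amdahl's law: T(n) = T1 * (s + (1 - s)/n), which is linear in 1/n:
#   T(n) = a + b/n, with a = T1*s (serial part) and b = T1*(1-s) (parallel part).
# Fit a and b by ordinary least squares, then recover s and T1.

def fit_amdahl(threads, times):
    xs = [1.0 / n for n in threads]
    mx = sum(xs) / len(xs)
    my = sum(times) / len(times)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, times))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx          # parallel component, T1*(1-s)
    a = my - b * mx        # serial component,  T1*s
    t1 = a + b             # model's predicted single-thread time
    s = a / t1             # estimated serial fraction
    return s, t1

# Measurements from the question: threads -> seconds
serial_fraction, t1 = fit_amdahl([1, 2, 4], [60.0, 60.0, 63.0])
print(f"estimated serial fraction: {serial_fraction:.2f}")
```

With these particular numbers the fitted serial fraction comes out slightly above 1, which is the model's way of saying the process shows no measurable parallel speedup at all (and even a small overhead at 4 threads). R-squared of this regression would then quantify how well the model explains the variation in the timings.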