
Math Help - System Performance - Big Oh and Big Omega

  1. #1
    Junior Member
    Joined
    Mar 2009
    Posts
    42

    System Performance - Big Oh and Big Omega

    Hi,

    Apologies in advance for the long post, but I have a lot to say! I've posted before about Big-O notation, and sometimes I think I get it, and then something comes along and confuses me again.

    As I currently understand it, Big-O notation represents the worst-case performance of a system/algorithm. I also understand that Big-Omega represents the best-case performance, and from what I can gather, Big-Theta is when the worst and best case are the same (please correct me if I'm wrong).

    So I've a paper where they're measuring how long it takes for the pointers of every device in the system (every device has two pointers, and there are N devices in the system) to become correct when all devices boot up. The paper calls this the time complexity. The devices have an opportunity to fix their pointers according to an interval t, so every interval t, every device will try to fix its pointers.

    The paper states that for the worst-case scenario all devices will have the correct pointers at time = (n+1)·t = N·t, and for the best case all devices will have the correct pointers at time = n·t = (N−1)·t. So, for example, for a network with 1000 devices and an interval of 30 s, the worst-case time is 30000 seconds, whereas the best case is 29970 seconds. The paper further states that since both the lower and the upper bound are O(N), there is a tight bound of Θ(N) on the time complexity – this is what I cannot understand, as I thought the best-case performance would be O(N−1).

    Now, when I posted previously, I was told that O(N−1) = O(N) and that Big-O notation has nothing to do with specific values of N but is an asymptotic property.

    Can someone please explain this to me further (maybe in less mathematical language)? I don't see how the best and worst case can be the same, and if they're the same "asymptotically", then what does asymptotically mean? I don't understand how it would be useful to describe system performance in this way.

    Many thanks in advance for help provided.

  2. #2
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
    Quote Originally Posted by kerrymaid
    Well, I can't explain it with less mathematical language, as it is a mathematical concept, but I can point to a conceptual error or misunderstanding that is evident in your post.

    Big-O/Big-Theta notation is not directly anything to do with worst/best case performance of a system. The notation deals with the rate of growth of functions. So if the worst case performance of your system were (exactly) f(n)=n^2+4n+7, then we would say that f(n)=O(n^2), but that is because of the rate of growth of f(n). We can as easily write f(n)=\Theta(n^2) in this case, as for large n:

    n^2<f(n)<12n^2
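    This sandwich is easy to check numerically. A small sketch (my illustration, not part of the formal argument; note the strict upper bound only kicks in from n = 2, since f(1) = 12 = 12·1^2 exactly, which is why "for large n" matters):

    ```python
    # Numerical check of the sandwich n^2 < f(n) < 12*n^2 for f(n) = n^2 + 4n + 7.
    # The strict bounds hold for every n >= 2 (at n = 1 the upper bound is an equality).
    def f(n):
        return n**2 + 4*n + 7

    for n in [2, 10, 100, 1000, 10**6]:
        assert n**2 < f(n) < 12 * n**2

    # The ratio f(n)/n^2 approaches 1, which is the sense in which
    # f "grows like" n^2: the lower-order terms 4n + 7 become negligible.
    print(f(10**6) / (10**6)**2)
    ```

    The printed ratio is very close to 1, and it gets closer as n grows; that is all "asymptotically the same" means here.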

    CB

  3. #3
    Junior Member
    Joined
    Mar 2009
    Posts
    42
    Hi CB,

    OK – I clearly haven't been thinking about this correctly. The paper goes on to say:

    "We have shown that the time complexity to a consistent state is Θ(N). One might be apt to argue that the linear convergence behavior is only due to the assumption that there is a single join point. However, the time complexity from a set of k join points is Θ(k + ⌈(N − k)/k⌉) = Θ(N), and thus the asymptotic time complexity of the bootstrapping protocol remains unchanged."

    Can you please explain why this is?

  4. #4
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
    Quote Originally Posted by kerrymaid
    If f(N) \in \Theta\left(k + \left\lceil \frac{N-k}{k} \right\rceil \right) then there exist constants a,b>0 such that:

    a\left(k + \left\lceil \frac{N-k}{k} \right\rceil \right)<|f(N)|<b\left(k + \left\lceil \frac{N-k}{k} \right\rceil \right)

    but: U \le \lceil U \rceil \le U+1, so:

    a\left(k +  \frac{N-k}{k}  \right)<|f(N)|<b\left(k + \frac{N-k}{k}+1  \right)

    So for N large enough and k\ge 1:

    a\frac{N}{k} <|f(N)|<2b\frac{N}{k}

    and so f(N)\in \Theta(N)

    (The above shows that \Theta\left(k + \left\lceil \frac{N-k}{k} \right\rceil \right)\subseteq \Theta(N) which is all you really require, but the argument can be run in the reverse direction to show that these two equivalence classes of functions are equal)
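    The same sandwich can be checked numerically. A small sketch (my illustration, with k fixed arbitrarily at 5) showing that g(N) = k + ⌈(N − k)/k⌉ sits between N/k and 2N/k once N ≥ k², so for any fixed k it grows linearly in N and only the constants change:

    ```python
    import math

    # For a fixed k, g(N) = k + ceil((N - k)/k) is sandwiched between
    # N/k and 2N/k once N >= k^2, hence g(N) is Theta(N): the number of
    # join points k only changes the hidden constants, not the growth rate.
    def g(N, k):
        return k + math.ceil((N - k) / k)

    k = 5
    for N in [k**2, 100, 1000, 10**5]:
        assert N / k <= g(N, k) <= 2 * N / k
    ```

    This is why the paper can say the asymptotic time complexity "remains unchanged": k is a constant with respect to N, so Θ(N/k) and Θ(N) are the same class.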

    CB
