
Math Help - understanding growth of function

  1. #1
    Newbie
    Joined
    Nov 2009
    Posts
    20

    understanding growth of function

    in my "Discrete Mathematics and Its Applications" book it is written:
    |f(x)| <= C|g(x)| whenever x > k

    "f is big oh of g"

    the constants C and k in the definition of big-O notation are called witnesses to the relationship f is O(g). To establish that f is O(g) we need only one pair of witnesses to this relationship. ... Note that when there is one pair of witnesses to the relationship f is O(g), there are infinitely many pairs of witnesses. To see this, note that if C and k are one pair of witnesses, then any pair C' and k', where C < C' and k < k', is also a pair of witnesses, since
    |f(x)| <= C|g(x)| <= C'|g(x)| whenever x > k' > k.
    i don't see why the "proof" is true: if we choose a k' that is A LOT bigger than k, and at the same time a C' which is only a little bit bigger than C - why are we still "winning"?

  2. #2
    MHF Contributor
    Joined
    Oct 2009
    Posts
    5,528
    Thanks
    773
    It is easier to see this if you change C and k separately. So, suppose that

    |f(x)| <= C|g(x)| for all x > k

    Also, suppose that C' >= C and k' >= k. Since |f(x)| <= C|g(x)| implies |f(x)| <= C'|g(x)| for any x, we have that

    |f(x)| <= C'|g(x)| for all x > k

    Now if some property holds for all x > k, and if k' >= k, then this property holds for all x > k'. This is because the set {x | x > k'} is a subset of {x | x > k}.
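    To make the witness argument concrete, here is a small Python sketch (my own illustrative example; the functions f, g and the witnesses C = 4, k = 5 are hypothetical, not from the book). It checks that once one pair of witnesses works, any larger pair also works.

```python
# Hypothetical example: f(x) = 3x + 5 is O(g) with g(x) = x,
# using the witness pair C = 4, k = 5 (since 3x + 5 <= 4x once x >= 5).
def f(x):
    return 3 * x + 5

def g(x):
    return x

def is_witness_pair(C, k, xs):
    """Check |f(x)| <= C|g(x)| for every sampled x > k."""
    return all(abs(f(x)) <= C * abs(g(x)) for x in xs if x > k)

xs = range(1, 10_000)
assert is_witness_pair(4, 5, xs)        # the original pair of witnesses works
assert is_witness_pair(4.1, 1_000, xs)  # C' barely larger, k' much larger: still works
```

    Enlarging k' only shrinks the set of x we must check, and enlarging C' only weakens the bound, which is why "a tiny increase in C with a huge increase in k" still wins.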

  3. #3
    Newbie
    Joined
    Nov 2009
    Posts
    20
    ahh, i think the idea of a subset made it clear, thx!

    i have a further question - let me make this clear with an example:

    why is f(n) = n; f is in O(log n) ?

    how do i prove log n <= c * n ?

  4. #4
    MHF Contributor
    Joined
    Oct 2009
    Posts
    5,528
    Thanks
    773
    why is f(n) = n; f is in O(log n) ?
    n is not O(log n). It's the other way around, log n = O(n).

    To show this, prove that n < 2^n for all n=0,1,2, ... This can be shown by induction. Then take \log_2 of both sides. Since log is a monotonic function, we get \log_2 n < n.

    The same thing happens for logarithms to other bases (at least those greater than 1). Namely, (\log_a n)/(\log_b n) is a constant, i.e., does not depend on n. This follows from one of the logarithm laws: \log_a b\log_b c=\log_a c.
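    Both facts can be sanity-checked numerically; this sketch is suggestive evidence only, not a substitute for the induction proof.

```python
import math

# (1) n < 2**n for all n >= 0, hence log2(n) < n for n >= 1.
for n in range(0, 60):
    assert n < 2 ** n
for n in range(2, 60):
    assert math.log2(n) < n

# (2) (log_a n)/(log_b n) = log_a(b), a constant that does not depend on n.
a, b = 10, 2
ratios = {round(math.log(n, a) / math.log(n, b), 9) for n in range(2, 1000)}
assert len(ratios) == 1  # the ratio is the same for every n
```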

  5. #5
    Newbie
    Joined
    Nov 2009
    Posts
    20
    Quote Originally Posted by emakarov View Post
    n is not O(log n). It's the other way around, log n = O(n).

    To show this, prove that n < 2^n for all n=0,1,2, ... This can be shown by induction. Then take \log_2 of both sides. Since log is a monotonic function, we get \log_2 n < n.

    The same thing happens for logarithms to other bases (at least those greater than 1). Namely, (\log_a n)/(\log_b n) is a constant, i.e., does not depend on n. This follows from one of the logarithm laws: \log_a b\log_b c=\log_a c.
    i am allowed to show it for ln n and extrapolate this to logarithms of every base, since the base of a logarithm is "something like a constant multiplier", right?

    the induction proof seems easy:
    induction start ( N = 1...infinity)
    1 < 2
    assumption:
    n < 2^n
    induction step:
    n+1 < 2^(n+1)
    n+1 < 2^n*2
    n < n+1 < 2^n <2^n*2
    n <  2^n induction step succeeded. am i right here?

    next question which arises while i am studying growth of functions:
    2^n \in O(2^{2n})
    the reason for this is that there is an x which makes 2^n <= 2^{2n} true - am i right on this?
    namely: n_0 = 1 + log(1/c) (this is the x-position where the inequality starts to hold) - am i right here? how did the author find n_0 ?
    Last edited by dayscott; December 10th 2009 at 03:43 PM.

  6. #6
    MHF Contributor
    Joined
    Oct 2009
    Posts
    5,528
    Thanks
    773
    i am allowed to show it for ln n and extrapolate this to logarithms of every base, since the base of a logarithm is "something like a constant multiplier", right?
    Pretty much. I think that \ln n usually denotes the natural logarithm (base e), and instead of 'the base of a logarithm is "something like a constant multiplier"' I would say, "going from \log_a n to \log_b n amounts to multiplying by a constant", but the meaning is similar. In fact, I would not put it in words at all and just write that (\log_a n)/(\log_b n) is a constant.

    Concerning your induction proof: it may be correct, but after looking at it for some time I am not sure I understand the reasoning, which should not happen in such a simple case. For the induction step, I suggest the following. (1) Write the induction hypothesis (IH) (you did that). (2) Write what you need to prove. (3) Show the flow of reasoning: does this line imply the next or vice versa? Ideally, you should have a sequence of lines where each one implies the next, and this is shown (e.g., by writing =>). (4) Indicate the place where you applied the IH. (5) Finish by actually proving the statement from step (2). Here you finish with n < 2^n, and it is not clear whether this is what you've been proving.

  7. #7
    Newbie
    Joined
    Nov 2009
    Posts
    20
    i try again:
    the induction proof #2 try:
    induction start ( N = 1...infinity)
    1 < 2
    assumption:
    n < 2^n ( P(n) )
    i want to know whether P(n) => P(n+1) is true
    thus i have to show that ( P (n+1)) is true, i start with:
    n+1 < 2^(n+1)
    i take what i want to show and work with it:
    n < n+1 < 2^n*2 (n is smaller than n+1)
    n < n+1 < 2^n <2^n*2 ( 2^n is smaller than 2^n*2)
    n <  2^n with ordinary transformations i end up with my assumption, so P(n+1) must be true if P(n) is true.

  8. #8
    MHF Contributor
    Joined
    Oct 2009
    Posts
    5,528
    Thanks
    773
    Showing that 2^n is O(2^{2n}) is simpler. Here 2^n<2^{2n} for all n>1. Indeed, 2^{2n}=(2^n)^2, and m<m^2 for every m>1 (and 2^n>1 when n>1).
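    The argument in this post can be checked numerically (a sanity check, not part of the proof):

```python
# 2**(2n) = (2**n)**2, and m < m*m whenever m > 1,
# so 2**n < 2**(2n) for every n >= 1. Integer arithmetic keeps this exact.
for n in range(1, 50):
    m = 2 ** n
    assert 2 ** (2 * n) == m * m  # 2^{2n} is the square of 2^n
    assert m < m * m              # m < m^2 since m > 1
```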


  9. #9
    MHF Contributor
    Joined
    Oct 2009
    Posts
    5,528
    Thanks
    773
    Quote Originally Posted by dayscott View Post
    i try again:
    the induction proof #2 try:
    induction start ( N = 1...infinity)
    1 < 2
    assumption:
    n < 2^n ( P(n) )
    i want to know whether P(n) => P(n+1) is true
    thus i have to show that ( P (n+1)) is true, i start with:
    n+1 < 2^(n+1)
    i take what i want to show and work with it:
    n < n+1 < 2^n*2 (n is smaller than n+1)
    n < n+1 < 2^n <2^n*2 ( 2^n is smaller than 2^n*2)
    n <  2^n with ordinary transformations i end up with my assumption, so P(n+1) must be true if P(n) is true.
    Sorry, I am still not sure I understand. So, in your proof each line implies the previous one. OK. We start with n < 2^n, which is the IH. How does it follow from here that n < n+1 < 2^n < 2^n*2? Presumably, this last formula means that n < n+1 (of course) and n+1 < 2^n (where does that follow from?) and 2^n < 2^n\cdot 2 (yes). Besides, how did you use n < 2^n in showing all this?

    Maybe I am tired and need to turn myself off for a couple of hours...


  10. #10
    Newbie
    Joined
    Nov 2009
    Posts
    20
    how would you compare these functions? (concerning bigO and littleO)
    f1(n) = 2^{ln n}
    f2(n) = n
    f3(n) = (ln n)^{ln n}
    f4(n) = (log n^2)^2

    my assumptions:

    1. claim: f2 \in O(f1)
    proof:
    n <= c* 2^{ln n }
    ln n <= ln(c* 2^{ln n})
    ln n <= ln(c) + ln(2^{ln n})
    ln n <= ln n* ln(2) (get rid of constant factor)

    q.e.d

    2. claim: f4 \in o(f3)
    proof:
    (logn^2)^2 < c* (ln n)^{ln n}
    (2logn)*(2logn) <  (c* ln n * c* ln n * c* ln n ...) ( ln n times)

    as soon as ln n > 2 , "f4 \in O(f3)" is true. ln n = 2 is true if n = e^2 , for all x > e^2 its true q.e.d

    3. claim: f1 \in o(f3)
    proof:
     2^{ln n} < c* (ln n) ^{ln n}
    2  <  c* ln n

    4. claim: f4 \in o(f2)
    (log n^2)^2 < c* n
    (2log n)^2 <= c* n
    2*(log n) * (logn) <= c * n
    or is it the other way round? i can't tell...
    n < c * 2*(log n) * (logn)

  11. #11
    MHF Contributor
    Joined
    Oct 2009
    Posts
    5,528
    Thanks
    773
    I agree with the claims you made.

    I'll address #1 in the following post.

    2. claim: f4 \in o(f3)
    proof:
    (logn^2)^2 < c* (ln n)^{ln n}
    (2logn)*(2logn) < (c* ln n * c* ln n * c* ln n ...) ( ln n times)
    (0) You only show that (\ln n^2)^2\in O((\ln n)^{\ln n}), not small-o. For that, you need to show that the limit of the first function divided by the second is 0. However, I do believe that the small-o claim is also true.

    (1) Why do you repeatedly multiply by c, along with \ln n? I believe c is not raised to the power \ln n.

    (2) It's not good to write (c\cdot\ln n \cdot c\cdot\ln n \cdot c\cdot\ln n ...) ( \ln n times). The number \ln n may be fractional, and it is not clear how to repeat c\ln n, say, \sqrt{3} times. (Is it going to be c\ln n\cdot c\ln n\cdot\ldots?) You may take a lower bound: \ln n\cdot\dots\cdot\ln n ( \lfloor \ln n\rfloor times), which is less than (\ln n)^{\ln n}, or something like that.

    (3) It does not make sense to say "as soon as \ln n > 2 , f_4 \in O(f_3) is true". The fact f_4 \in O(f_3) is either true or false. It cannot be true for some n and false for others. It concerns functions as a whole.

    For #3 and #4, you again only show big-O. For #4: (\log n)^\alpha\in o(n^\beta) for positive constants \alpha, \beta. I am not sure right now how to show this; I may need to search online.
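    As a numeric sketch of the small-o claims discussed above (suggestive evidence only - a real proof needs the limit computation; I am reading log as the natural logarithm here, matching the rest of the thread):

```python
import math

# The four functions from the question, with log taken as natural log.
def f1(n): return 2 ** math.log(n)
def f2(n): return n
def f3(n): return math.log(n) ** math.log(n)
def f4(n): return math.log(n ** 2) ** 2

# For the small-o claims f4 in o(f3), f1 in o(f3), f4 in o(f2),
# the ratios should shrink toward 0 as n grows through 10^2 ... 10^6.
for ratio in (lambda n: f4(n) / f3(n),
              lambda n: f1(n) / f3(n),
              lambda n: f4(n) / f2(n)):
    values = [ratio(10 ** k) for k in range(2, 7)]
    assert all(a > b for a, b in zip(values, values[1:]))  # strictly decreasing
```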

  12. #12
    MHF Contributor
    Joined
    Oct 2009
    Posts
    5,528
    Thanks
    773
    A style of a mathematical proof may range from fully formal (and even formalized, i.e., checked by a special computer program that understands the laws of reasoning) to very informal. Let me list several characteristics of each style and the situations where each is used.

    (1) Strict proofs are more likely to proceed by definition than non-strict ones. In a strict proof, one may give precise values of C and n_0 and show that |f(n)|<C|g(n)| for all n\ge n_0. In a less formal proof, one is more likely to invoke derived facts, such as that f(n) and af(n), where a is a constant, are big-O of each other.

    (2) Strict proofs are careful about properly introducing each concept and each new variable by phrases like "Fix an arbitrary n" or "Consider x such that ...". Non-strict proofs may be less scrupulous.

    (3) Similarly, strict proofs use approved laws of reasoning, while non-strict ones may resemble the trace of a person's search for a proof rather than the proof itself. For example, mathematical journal articles present finished results, and their goal is to provide enough justification. The outline of such articles may be very different from the process by which the author arrived at the result. Informal presentations, on the other hand, try to convince the audience that some new approach or method is natural and useful while hiding technical details.

    (4) In a strict proof, one is concerned with precision; being brief or conveying an intuitive idea is less important. In non-strict proofs one is likely to use the shortest and simplest approach.

    (5) It is often implicitly expected that the simpler the problem is, the stricter the solution should be. A basic question is usually asked either when the person asking it is not fluent in the material and so needs a detailed explanation, or when the question is addressed to students who are learning the topic and the goal is to lay a firm foundation. In contrast, more advanced questions are usually discussed among people for whom the basics do not present much difficulty, so these people can afford to be casual without being vague or confusing. Besides, it is harder to write long proofs with the same level of detail as short ones.

    (6) Finally, if a student misses a detail or makes a mistake, the instructor is more likely to view the student's work as less strict, depending on the importance of the detail or the severity of the error.

    Let's now look where your proofs fall with respect to these points.

    Showing n\in O(2^{\ln n}):
    proof:
    n <= c* 2^{ln n }
    ln n <= ln(c* 2^{ln n})
    ln n <= ln(c) + ln(2^{ln n})
    ln n <= ln n* ln(2) (get rid of constant factor)
    (1) The proof is by definition => strict.

    (2) Not every variable (e.g., c) is properly introduced => non-strict.

    (3) I'd like to stress that starting from the claim one needs to prove and rewriting it until one gets a true statement (as was also done in previous posts) is not a strict proof. A formal proof must start with what is known and give reasons why the final claim is true => non-strict.

    (4) If \ln n is the logarithm to the base 2, why would one not rewrite 2^{\ln n} as n, so that f_1(n)=f_2(n) and the claim is obvious? So if \ln n=\log_2 n, the proof seems strict.

    (5) The problem is simple => strict.

    (6) This is similar to (4). If \ln is the binary logarithm, not rewriting 2^{\ln n} as n is strange. If \ln n is the natural logarithm, i.e., \log_e n, which is more standard, then the claim that n\in O(2^{\ln n}) is not true. Indeed, 2^{\ln n}=2^{(\log_2 n)/(\log_2 e)}=n^{1/(\log_2 e)}. Now 2 < e, so \log_2 e>1 and 1/(\log_2 e)<1. Then n is not O(n^\alpha) when \alpha < 1. In either case, the proof gives the impression of a non-strict one.

    So you see, your style is all over the place. Imagine the impression a judge would get from a defense lawyer's brief which is written in good legalese but which lists all kinds of facts, both in favor of and against his client. It would be pretty confusing to read.
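    Point (6) can be sanity-checked numerically; the sketch below assumes \ln is the natural logarithm:

```python
import math

# With ln = log_e we have 2**ln(n) = n**alpha where alpha = 1/log2(e) = ln(2) ~ 0.693.
alpha = 1 / math.log2(math.e)
assert math.isclose(alpha, math.log(2))

for n in (10, 100, 10 ** 6):
    assert math.isclose(2 ** math.log(n), n ** alpha)

# n / 2**ln(n) = n**(1 - alpha) grows without bound, so n is not O(2**ln n).
assert 10 ** 9 / 2 ** math.log(10 ** 9) > 100
```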

  13. #13
    Newbie
    Joined
    Nov 2009
    Posts
    20
    gee.

    i learned a lot from the last post. THANK YOU .

    actually the task was to arrange these 4 functions by bigO and littleO - as we see now, these are really only 3 distinct functions i have to arrange. but you are right: in order to show littleO i have to use limits.

    EDIT:

    Quote Originally Posted by emakarov View Post
    (4) If \ln n is the logarithm to the base 2, why would one not rewrite 2^{\ln n} as n, so that f_1(n)=f_2(n) and the claim is obvious? .
    ln n is NOT the logarithm base 2. it is the logarithm base e. that is an international convention.
    Last edited by dayscott; December 11th 2009 at 01:55 PM.

  14. #14
    Newbie
    Joined
    Nov 2009
    Posts
    20
    concerning growth of functions i have a question:
    is 3^k \in O(2^k) ? (*)

    3^k <= c * 2^k

    i can find a k and a c.
    For k = 1 and c = 2, (*) is true, right?

  15. #15
    MHF Contributor
    Joined
    Oct 2009
    Posts
    5,528
    Thanks
    773
    The definition of f(k)\in O(g(k)) asks for an index N such that f(k)\le Cg(k) for all k > N. You are checking only one k here.

    Keep in mind that the definition of big-O considers functions as a whole, looking at the trend for large k instead of looking at some particular k in isolation. The first intuitive idea about f(k)\in O(g(k)) is " f(k) is dominated by g(k) eventually", i.e., from some point onward. Think about the graphs of f and g: this means that if you zoom out far enough, the graph of f will lie below the graph of g from some point and all the way to the right. So, does 3^k become less than 2^k eventually?

    A better approximation to the definition is " f(k) is dominated by twice g(k) eventually". Does 3^k become less than 2\cdot 2^k from some point on? The real definition is " f(k) is dominated by C\cdot g(k) eventually for some constant C". This is the same thing as " f(k)/g(k) is dominated by some constant C eventually". If you consider 3^k/2^k, can you give an upper bound that it never exceeds, or will this ratio become bigger than any constant that one can name?

    (I ignored absolute values in this post considering both f and g to be nonnegative.)
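    The last question above can be explored numerically (a sketch, not a proof): the ratio 3^k/2^k = 1.5^k eventually exceeds any constant one names, which is why no witness pair (C, N) can exist.

```python
# 3**k / 2**k = 1.5**k grows without bound: for every candidate constant C,
# some k makes the ratio exceed C, so 3^k is not O(2^k).
def ratio(k):
    return 3 ** k / 2 ** k

for C in (1, 10, 1_000, 10 ** 9):
    assert any(ratio(k) > C for k in range(0, 200))
```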
