# Thread: Understanding growth of functions

1. ## Understanding growth of functions

In my "Discrete Mathematics and Its Applications" book it is written:
|f(x)| <= C|g(x)| whenever x > k

"f is big-O of g"

The constants C and k in the definition of big-O notation are called witnesses to the relationship f is O(g). To establish that f is O(g) we need only one pair of witnesses to this relationship. ... Note that when there is one pair of witnesses to the relationship f is O(g), there are infinitely many pairs of witnesses. To see this, note that if C and k are one pair of witnesses, then any pair C' and k', where C < C' and k < k', is also a pair of witnesses, since
|f(x)| <= C|g(x)| <= C'|g(x)| whenever x > k' > k.
I don't see why the "proof" is true: if we choose a k' that is really A LOT bigger than k, and at the same time choose a C' that is just a little bit bigger than C, why are we still "winning"?

2. It is easier to see this if you change C and k separately. So, suppose that

|f(x)| <= C|g(x)| for all x > k

Also, suppose that C' >= C and k' >= k. Since |f(x)| <= C|g(x)| implies |f(x)| <= C'|g(x)| for any x, we have that

|f(x)| <= C'|g(x)| for all x > k

Now if some property holds for all x > k, and if k' >= k, then this property holds for all x > k'. This is because the set {x | x > k'} is a subset of {x | x > k}.
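The subset argument above can be sanity-checked numerically. Here is a minimal Python sketch; the functions f(x) = 3x + 5, g(x) = x and the witnesses C = 4, k = 5 are illustrative choices, not from the book:

```python
def is_witness_pair(f, g, C, k, xs):
    """Check |f(x)| <= C|g(x)| for every sampled x > k."""
    return all(abs(f(x)) <= C * abs(g(x)) for x in xs if x > k)

f = lambda x: 3 * x + 5   # illustrative f
g = lambda x: x           # illustrative g
xs = range(1, 10_000)

# (4, 5) is a witness pair: 3x + 5 <= 4x once x >= 5.
assert is_witness_pair(f, g, C=4, k=5, xs=xs)
# Any larger pair (C', k') with C' >= C and k' >= k is then also a witness pair.
assert is_witness_pair(f, g, C=10, k=100, xs=xs)
# A smaller C can fail: 3x + 5 <= 1*x never holds for positive x.
assert not is_witness_pair(f, g, C=1, k=5, xs=xs)
```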

3. Ahh, I think the idea of a subset made it clear, thanks!

I have a further question - let me make this clear with an example:

why is f(n) = n; f is in O(log n)?

how do I prove log n <= c * n?

4. why is f(n) = n; f is in O(log n)?
n is not O(log n). It's the other way around: log n = O(n).

To show this, prove that $\displaystyle n < 2^n$ for all $\displaystyle n=0,1,2$, ... This can be shown by induction. Then take $\displaystyle \log_2$ of both sides. Since log is a monotonic function, we get $\displaystyle \log_2 n < n$.

The same thing happens for logarithms to other bases (at least those greater than 1). Namely, $\displaystyle (\log_a n)/(\log_b n)$ is a constant, i.e., does not depend on $\displaystyle n$. This follows from one of the logarithm laws: $\displaystyle \log_a b\log_b c=\log_a c$.
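The base-change law can also be checked numerically. A small Python sketch (the sample values of n are arbitrary):

```python
import math

# log_2(n) / ln(n) equals the constant 1 / ln(2) for every n,
# since log_2 n = (ln n) / (ln 2).
ratios = [math.log2(n) / math.log(n) for n in (10, 100, 10**6)]
assert all(abs(r - 1 / math.log(2)) < 1e-12 for r in ratios)
```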

5. Originally Posted by emakarov
n is not O(log n). It's the other way around: log n = O(n).

To show this, prove that $\displaystyle n < 2^n$ for all $\displaystyle n=0,1,2$, ... This can be shown by induction. Then take $\displaystyle \log_2$ of both sides. Since log is a monotonic function, we get $\displaystyle \log_2 n < n$.

The same thing happens for logarithms to other bases (at least those greater than 1). Namely, $\displaystyle (\log_a n)/(\log_b n)$ is a constant, i.e., does not depend on $\displaystyle n$. This follows from one of the logarithm laws: $\displaystyle \log_a b\log_b c=\log_a c$.
I am allowed to show it for ln n and extend this to logarithms of every base, since the base of a logarithm is "something like a constant multiplier", right?

the induction proof seems easy:
induction start (n = 1):
$\displaystyle 1 < 2$
assumption:
$\displaystyle n < 2^n$
induction step:
$\displaystyle n+1 < 2^{n+1}$
$\displaystyle n+1 < 2^n\cdot 2$
$\displaystyle n < n+1 < 2^n < 2^n\cdot 2$
$\displaystyle n < 2^n$ induction step succeeded. Am I right here?

the next question arises while I am studying growth of functions:
$\displaystyle 2^n \in O(2^{2n})$
The reason for this is that there is an x which makes $\displaystyle 2^n \le 2^{2n}$ true. Am I right on this?
Namely: $\displaystyle n_0 = 1 + \log(1/c)$ (this is the x-position from which the inequality starts to be true). Am I right here? How did the author find $\displaystyle n_0$?

I am allowed to show it for ln n and extend this to logarithms of every base, since the base of a logarithm is "something like a constant multiplier", right?
Pretty much. I think that $\displaystyle \ln n$ usually denotes natural logarithm (base $\displaystyle e$), and instead of 'the base of a logarithm is "something like a constant multiplier"' I would say, "going from $\displaystyle \log_a n$ to $\displaystyle \log_b n$ amounts to multiplying by a constant", but the meaning is similar. In fact, I would not put it in words at all and just write that $\displaystyle (\log_a n)/(\log_b n)$ is a constant.

Concerning your induction proof, it may be correct, but after looking at it for some time I am not sure I understand the reasoning, which should not happen in such a simple case. For the induction step, I suggest the following. (1) Write the induction hypothesis (IH) (you did that). (2) Write what you need to prove. (3) Show the flow of reasoning: does this line imply the next or vice versa? Ideally, you should have a sequence of lines where each one implies the next, and this is shown (e.g., by writing =>). (4) Indicate the place where you applied the IH. (5) Finish by actually proving the statement from step (2). Here you finish with $\displaystyle n < 2^n$, and it is not clear whether this is what you've been proving.

7. I try again:
the induction proof, try #2:
induction start (n = 1):
$\displaystyle 1 < 2$
assumption:
$\displaystyle n < 2^n$ ( P(n) )
I want to know whether P(n) => P(n+1) is true,
thus I have to show that P(n+1) is true. I start with:
$\displaystyle n+1 < 2^{n+1}$
I take what I want to show and work with it:
$\displaystyle n < n+1 < 2^n\cdot 2$ (n is smaller than n+1)
$\displaystyle n < n+1 < 2^n < 2^n\cdot 2$ (2^n is smaller than 2^n*2)
$\displaystyle n < 2^n$ With ordinary transformations I end up with my assumption, so P(n+1) must be true if P(n) is true.

8. Showing that $\displaystyle 2^n \in O(2^{2n})$ is simpler. Here $\displaystyle 2^n<2^{2n}$ for all $\displaystyle n>1$. Indeed, $\displaystyle 2^{2n}=(2^n)^2$, and $\displaystyle m<m^2$ for every $\displaystyle m>1$ (and $\displaystyle 2^n>1$ when $\displaystyle n>1$).
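A quick Python check of this argument (the range of n tested is an arbitrary choice):

```python
# 2^(2n) = (2^n)^2, and m < m^2 once m > 1, so 2^n < 2^(2n) for all n > 1
# (in fact already for n >= 1).
for n in range(1, 60):
    m = 2 ** n
    assert m > 1 and m < m * m        # m < m^2 since m > 1
    assert 2 ** n < 2 ** (2 * n)
```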


9. Originally Posted by dayscott
I try again:
the induction proof, try #2:
induction start (n = 1):
$\displaystyle 1 < 2$
assumption:
$\displaystyle n < 2^n$ ( P(n) )
I want to know whether P(n) => P(n+1) is true,
thus I have to show that P(n+1) is true. I start with:
$\displaystyle n+1 < 2^{n+1}$
I take what I want to show and work with it:
$\displaystyle n < n+1 < 2^n\cdot 2$ (n is smaller than n+1)
$\displaystyle n < n+1 < 2^n < 2^n\cdot 2$ (2^n is smaller than 2^n*2)
$\displaystyle n < 2^n$ With ordinary transformations I end up with my assumption, so P(n+1) must be true if P(n) is true.
Sorry, I still am not sure I understand. So, in your proof each line implies the previous one. OK. We start with $\displaystyle n<2^n$, which is the IH. How does it follow from here that $\displaystyle n < n+1 < 2^n < 2^n\cdot 2$? Presumably, this last formula means that $\displaystyle n < n+1$ (of course) and $\displaystyle n+1 < 2^n$ (where does this follow from?) and $\displaystyle 2^n < 2^n\cdot 2$ (yes). Besides, how did you use $\displaystyle n<2^n$ in showing all this?

Maybe I am tired and need to turn myself off for a couple of hours...


10. How would you compare these functions (concerning big-O and little-o)?
$\displaystyle f_1(n) = 2^{\ln n}$
$\displaystyle f_2(n) = n$
$\displaystyle f_3(n) = (\ln n)^{\ln n}$
$\displaystyle f_4(n) = (\log n^2)^2$

my assumptions:

1. claim: $\displaystyle f_2 \in O(f_1)$
proof:
$\displaystyle n \le c\cdot 2^{\ln n}$
$\displaystyle \ln n \le \ln(c\cdot 2^{\ln n})$
$\displaystyle \ln n \le \ln(c) + \ln(2^{\ln n})$
$\displaystyle \ln n \le \ln n\cdot \ln(2)$ (get rid of the constant factor)

q.e.d.

2. claim: $\displaystyle f_4 \in o(f_3)$
proof:
$\displaystyle (\log n^2)^2 < c\cdot (\ln n)^{\ln n}$
$\displaystyle (2\log n)\cdot(2\log n) < (c\cdot \ln n \cdot c\cdot \ln n \cdot c\cdot \ln n \dots)$ ($\displaystyle \ln n$ times)

As soon as $\displaystyle \ln n > 2$, "$\displaystyle f_4 \in O(f_3)$" is true. $\displaystyle \ln n = 2$ holds if $\displaystyle n = e^2$; for all $\displaystyle x > e^2$ it is true. q.e.d.

3. claim: $\displaystyle f_1 \in o(f_3)$
proof:
$\displaystyle 2^{\ln n} < c\cdot (\ln n)^{\ln n}$
$\displaystyle 2 < c\cdot \ln n$

4. claim: $\displaystyle f_4 \in o(f_2)$
$\displaystyle (\log n^2)^2 < c\cdot n$
$\displaystyle (2\log n)^2 \le c\cdot n$
$\displaystyle 4\cdot(\log n)\cdot(\log n) \le c\cdot n$
or is it the other way round? I can't tell...
$\displaystyle n < c\cdot 4\cdot(\log n)\cdot(\log n)$

11. I agree with the claims you made.

I'll address #1 in the following post.

2. claim: $\displaystyle f4 \in o(f3)$
proof:
$\displaystyle (\log n^2)^2 < c\cdot (\ln n)^{\ln n}$
$\displaystyle (2\log n)\cdot(2\log n) < (c\cdot \ln n \cdot c\cdot \ln n \cdot c\cdot \ln n \dots)$ ($\displaystyle \ln n$ times)
(0) You only show that $\displaystyle (\ln n^2)^2\in O((\ln n)^{\ln n})$, not small-o. For that, you need to show that the limit of the first function divided by the second is 0. However, I do believe that the small-o claim is also true.

(1) Why do you repeatedly multiply by $\displaystyle c$ along with $\displaystyle \ln n$? I believe $\displaystyle c$ is not raised to the power $\displaystyle \ln n$.

(2) It's not good to write $\displaystyle (c\ln n \cdot c\ln n \cdot c\ln n \dots)$ ($\displaystyle \ln n$ times). The number $\displaystyle \ln n$ may be fractional, and it is not clear how to repeat $\displaystyle c\ln n$, say, $\displaystyle \sqrt{3}$ times. You may take a lower bound instead: $\displaystyle \ln n\cdot\dots\cdot\ln n$ ($\displaystyle \lfloor \ln n\rfloor$ times), which is less than $\displaystyle (\ln n)^{\ln n}$, or something like that.

(3) It does not make sense to say "as soon as $\displaystyle \ln n > 2$ , $\displaystyle f_4 \in O(f_3)$ is true". The fact $\displaystyle f_4 \in O(f_3)$ is either true or false. It cannot be true for some $\displaystyle n$ and false for others. It concerns functions as a whole.

For #3 and #4, you again only show big-O. For #4: $\displaystyle (\log n)^\alpha\in o(n^\beta)$ for constants $\displaystyle \alpha, \beta$. I am not sure right now how to show this; may need to search online.
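A numeric illustration (not a proof) of the fact that $\displaystyle (\log n)^\alpha\in o(n^\beta)$: in Python, with the illustrative constants alpha = 2 and beta = 1, the ratio shrinks toward 0 as n grows.

```python
import math

alpha, beta = 2, 1   # illustrative constants, not fixed by the problem
ratios = [(math.log(n) ** alpha) / n ** beta for n in (10**2, 10**4, 10**8)]
assert ratios[0] > ratios[1] > ratios[2]   # the ratio keeps shrinking
assert ratios[-1] < 1e-5                   # and is already tiny at n = 10^8
```

Of course, sampled values only suggest the trend; the actual proof requires showing the limit is 0.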

12. A style of a mathematical proof may range from fully formal (and even formalized, by checking a proof using a special computer program that understands the laws of reasoning) to very informal. Let me list several characteristics of each style and situations where they are used.

(1) Strict proofs are more likely to proceed by definition than non-strict ones. In a strict proof, one may give precise values of $\displaystyle C$ and $\displaystyle n_0$ and show that $\displaystyle |f(n)|<C|g(n)|$ for all $\displaystyle n\ge n_0$. In a less formal proof, one is more likely to invoke derived facts, such as that $\displaystyle f(n)$ and $\displaystyle af(n)$, where $\displaystyle a$ is a constant, are big-O of each other.

(2) Strict proofs are careful about properly introducing each concept and each new variable by phrases like "Fix an arbitrary $\displaystyle n$" or "Consider $\displaystyle x$ such that ...". Non-strict proofs may be less scrupulous.

(3) Similarly, strict proofs use approved laws of reasoning, while non-strict ones may resemble the trace of a person's search for a proof rather than the proof itself. For example, mathematical journal articles present finished results, and their goal is to provide enough justification. The outline of such articles may be very different from the process by which the author arrived at the result. Informal presentations, on the other hand, try to convince the audience why some new approach or method is natural and useful while hiding technical details.

(4) In a strict proof, one is concerned with precision; being brief or conveying an intuitive idea is less important. In non-strict proofs one is likely to use the shortest and simplest approach.

(5) It is often implicitly expected that the simpler the problem is, the stricter the solution should be. A basic question is usually asked either when the person asking it is not fluent in the material and so needs a detailed explanation, or when the question is addressed to students who are learning the topic and the goal is to lay a firm foundation. In contrast, more advanced questions are usually discussed among people for whom the basics do not present much difficulty, so these people can afford to be casual without being vague or confusing. Besides, it is harder to write long proofs with the same level of detail as short ones.

(6) Finally, if a student misses a detail or makes a mistake, the instructor is more likely to view the student's work as less strict, depending on the importance of the detail or the severity of the error.

Let's now look where your proofs fall with respect to these points.

Showing $\displaystyle n\in O(2^{\ln n})$:
proof:
$\displaystyle n \le c\cdot 2^{\ln n}$
$\displaystyle \ln n \le \ln(c\cdot 2^{\ln n})$
$\displaystyle \ln n \le \ln(c) + \ln(2^{\ln n})$
$\displaystyle \ln n \le \ln n\cdot \ln(2)$ (get rid of the constant factor)
(1) The proof is by definition => strict.

(2) Not every variable (e.g., $\displaystyle c$) is properly introduced => non-strict.

(3) I'd like to stress that starting from the claim one needs to prove and rewriting it until one gets a true statement (as was also done in previous posts) is not a strict proof. A formal proof must start with what is known and give reasons why the final claim is true => non-strict.

(4) If $\displaystyle \ln n$ is the logarithm to the base 2, why would one not rewrite $\displaystyle 2^{\ln n}$ as $\displaystyle n$, so that $\displaystyle f_1(n)=f_2(n)$ and the claim is obvious? So if $\displaystyle \ln n=\log_2 n$, the proof seems strict.

(5) The problem is simple => strict.

(6) This is similar to (4). If $\displaystyle \ln$ is the binary logarithm, not rewriting $\displaystyle 2^{\ln n}$ as $\displaystyle n$ is strange. If $\displaystyle \ln n$ is the natural logarithm, i.e., $\displaystyle \log_e$, which is more standard, then the claim that $\displaystyle n\in O(2^{\ln n})$ is not true. Indeed, $\displaystyle 2^{\ln n}=2^{(\log_2 n)/(\log_2 e)}=n^{1/(\log_2 e)}$. Now $\displaystyle 2 < e$, so $\displaystyle \log_2 e>1$ and $\displaystyle 1/(\log_2 e)<1$. Then $\displaystyle n$ is not $\displaystyle O(n^\alpha)$ where $\displaystyle \alpha < 1$. In both cases, the proof gives the impression of a non-strict one.
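Point (6) can be illustrated numerically in Python: with the natural logarithm, $\displaystyle 2^{\ln n} = n^{\ln 2}\approx n^{0.693}$, so the ratio $\displaystyle n / 2^{\ln n} = n^{1-\ln 2}$ grows without bound (the sampled values of n are arbitrary):

```python
import math

ratios = [n / 2 ** math.log(n) for n in (10**2, 10**4, 10**8)]
assert ratios[0] < ratios[1] < ratios[2]   # the ratio keeps growing,
assert ratios[-1] > 100                    # so n cannot be O(2^(ln n))
```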

So you see, your style is all over the place. Imagine the impression a judge would get from a defense lawyer's court paper which is written in good legalese but which lists all kinds of facts, both in favor of and against his client. It would be pretty confusing to read.

13. Gee.

I learned a lot from the last post. THANK YOU.

Actually the task was to arrange these 4 functions with respect to big-O and little-o; as we see now there are actually only 3 functions I have to arrange. But you are right: in order to show little-o I have to use limits.

EDIT:

Originally Posted by emakarov
(4) If $\displaystyle \ln n$ is the logarithm to the base 2, why would one not rewrite $\displaystyle 2^{\ln n}$ as $\displaystyle n$, so that $\displaystyle f_1(n)=f_2(n)$ and the claim is obvious?
ln n is NOT the logarithm base 2. It is the logarithm base e. That is an international convention.

14. Concerning growth of functions I have a question:
is $\displaystyle 3^k \in O(2^k)$? (*)

$\displaystyle 3^k \le c \cdot 2^k$

I can find a k and a c.
For k = 1 and c = 2, (*) is true, right?

15. The definition of $\displaystyle f(k)\in O(g(k))$ asks for an index $\displaystyle N$ such that $\displaystyle f(k)\le Cg(k)$ for all $\displaystyle k > N$. You are checking only one $\displaystyle k$ here.

Keep in mind that the definition of big-O considers functions as a whole, looking at the trend for large $\displaystyle k$ instead of looking at some particular $\displaystyle k$ in isolation. The first intuitive idea about $\displaystyle f(k)\in O(g(k))$ is "$\displaystyle f(k)$ is dominated by $\displaystyle g(k)$ eventually", i.e., from some point onward. Think about the graphs of $\displaystyle f$ and $\displaystyle g$: this means that if you zoom out far enough, the graph of $\displaystyle f$ will lie below the graph of $\displaystyle g$ from some point and all the way to the right. So, does $\displaystyle 3^k$ become less than $\displaystyle 2^k$ eventually?

A better approximation to the definition is "$\displaystyle f(k)$ is dominated by twice $\displaystyle g(k)$ eventually". Does $\displaystyle 3^k$ become less than $\displaystyle 2\cdot 2^k$ from some point on? The real definition is "$\displaystyle f(k)$ is dominated by $\displaystyle C\cdot g(k)$ eventually for some constant $\displaystyle C$". This is the same thing as "$\displaystyle f(k)/g(k)$ is dominated by some constant $\displaystyle C$ eventually". If you consider $\displaystyle 3^k/2^k$, can you give an upper bound that it never exceeds, or will this ratio become bigger than any constant that one can name?

(I ignored absolute values in this post considering both $\displaystyle f$ and $\displaystyle g$ to be nonnegative.)
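The hint about the ratio can be made concrete in Python: $\displaystyle 3^k/2^k = (3/2)^k$ eventually passes any fixed bound $\displaystyle C$ (the value C = 10^6 below is just an arbitrary example):

```python
C = 1_000_000   # an arbitrary candidate bound
k = 1
while (3 / 2) ** k <= C:
    k += 1

# (3/2)^k exceeds C from some k on, and the same happens for any C,
# so 3^k / 2^k is unbounded and 3^k is not O(2^k).
assert (3 / 2) ** k > C
assert (3 / 2) ** (k - 1) <= C
```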
