Discrete Mathematics

Growth of Functions

Hierarchy of Functions

$$ 1 < \log(n) < n < n\log(n) < n^2 < 2^n < n! $$

These inequalities hold eventually, i.e. for large enough $n$.
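The ordering above can be seen numerically. A minimal sketch (the sample point $n = 20$ is an arbitrary illustrative choice, not part of the notes):

```python
import math

# Evaluate each function in the hierarchy at n = 20 to see the ordering.
n = 20
hierarchy = [
    ("1", 1),
    ("log n", math.log(n)),
    ("n", n),
    ("n log n", n * math.log(n)),
    ("n^2", n ** 2),
    ("2^n", 2 ** n),
    ("n!", math.factorial(n)),
]
for name, value in hierarchy:
    print(f"{name:>8}: {value:,.2f}")

# The values are strictly increasing at this n.
assert all(a[1] < b[1] for a, b in zip(hierarchy, hierarchy[1:]))
```

At smaller $n$ some of the inequalities fail (e.g. $2^n < n^2$ for $n = 3$), which is why the hierarchy is stated only for large enough $n$.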

Definition

Let $f, g: \mathbb{Z} \to \mathbb{R}$ be functions.
Then $f$ is "big-O" of $g$,
written $f(x) = O(g(x))$ or $f = O(g)$, if there are positive real numbers $C, k$
such that $|f(x)| \le C|g(x)|$ whenever $x > k$.

$C$ and $k$ are called witnesses to $f(x) = O(g(x))$.

eg: $f(x) = x^2 + 2x + 1$. Give a big-O estimate of $f$. Solution: overestimate $f(x)$ in terms of its largest term.

$$ f(x) = x^2 + 2x + 1 < x^2 + 2x^2 + x^2 = 4x^2, \text{ when } x > 1. $$

Let $C = 4$, $k = 1$. Then $f(x) = O(x^2)$.
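The witnesses can be spot-checked numerically. A minimal sketch (the helper `check_witnesses` is my own name; sampling points is a sanity check of the definition, not a proof):

```python
def check_witnesses(f, g, C, k, xs):
    """Return True if |f(x)| <= C*|g(x)| at every sample x > k.
    A numeric spot-check of big-O witnesses, not a proof."""
    return all(abs(f(x)) <= C * abs(g(x)) for x in xs if x > k)

f = lambda x: x ** 2 + 2 * x + 1
g = lambda x: x ** 2
print(check_witnesses(f, g, C=4, k=1, xs=range(2, 1000)))  # True
print(check_witnesses(f, g, C=1, k=1, xs=range(2, 1000)))  # False: C too small
```

Note that witnesses are not unique: any larger $C$ or $k$ also works.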

Intuition

$f$ grows only as fast as its largest term; in other words, the largest term governs how fast the function grows.

eg: Find a big-O estimate of $f(x) = {(x^4 + x^2 + 1)\over(x^4 + 1)}$.

Sketch: $f(x) = O(x^4/x^4) = O(1)$.

Solution: $f(x) = {(x^4 + x^2 + 1)\over(x^4 + 1)} < {(x^4 + x^2 + 1)\over(x^4)} = x^4/x^4 + x^2/x^4 + 1/x^4.$

$$ = 1 + 1/x^2 + 1/x^4 \\ < 1 + 1 + 1 = 3, \text{ when } x > 1. $$

Therefore: $f(x) = O(1)$
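A quick numeric check of the bound just derived (illustrative only): the values stay below 3, and in fact approach 1.

```python
f = lambda x: (x ** 4 + x ** 2 + 1) / (x ** 4 + 1)

# The derivation bounds f(x) by 3 for x > 1; numerically f(x) -> 1.
samples = [f(x) for x in range(2, 10_000)]
assert all(v < 3 for v in samples)
print(f(2), f(10_000))
```

The bound $3$ is loose on purpose: big-O only requires *some* constant, not the best one.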

eg: Find a big-O estimate for $n!$. Solution: $n! = 1 \cdot 2 \cdot 3 \cdots (n-1) \cdot n < n^n$ for $n > 1$. Therefore: $n! = O(n^n)$.

Since $n! < n^n$, taking logarithms gives $\log(n!) < \log(n^n) = n\log(n)$.

Therefore: $\log(n!) = O(n\log(n))$.
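Numerically, $\log(n!)$ (computed as a sum of logs) indeed stays below $n\log(n)$. A minimal spot check, not a proof:

```python
import math

# Compare log(n!) with n*log(n) at a few sample sizes.
for n in (10, 100, 1000):
    log_factorial = sum(math.log(i) for i in range(1, n + 1))  # log(n!)
    bound = n * math.log(n)                                    # n log n
    print(n, round(log_factorial, 2), round(bound, 2))
    assert log_factorial <= bound
```

Stirling's approximation shows the two quantities are in fact asymptotically equal up to lower-order terms.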

(see Stirling's approximation)

Facts

$n^c = O(n^d)$, if $c < d$. (Higher-degree polynomials grow faster.)

$(\log_b(n))^c = O(n^d)$, for $b > 1$ and any real $c, d > 0$.

(logarithms grow slower than polynomials)

$n^d = O(b^n)$, for $b > 1$ and any real $d > 0$; more generally, $b^n = O(c^n)$ whenever $1 < b \le c$.

(polynomials grow slower than exponentials, and an exponential with a smaller base grows slower)
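These facts can be spot-checked by comparing logarithms instead of the astronomically large raw values. The constants $b, c, d$ below are illustrative choices of mine, not canonical:

```python
import math

n = 10 ** 9  # a single large sample point (illustrative)

# (log_2 n)^10 vs n^2: compare log-values to avoid huge numbers.
log_lhs = 10 * math.log(math.log2(n))  # log of (log_2 n)^10
log_rhs = 2 * math.log(n)              # log of n^2
print(log_lhs < log_rhs)  # True: the polylog is smaller here

# n^5 vs 1.5^n, again compared in log form.
print(5 * math.log(n) < n * math.log(1.5))  # True: the exponential wins
```

Note that the crossover can be late: at $n = 10^6$, $(\log_2 n)^{10}$ still exceeds $n^2$, which is why these facts only hold for large enough $n$.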

Theorem

Let $f_1(x)$ and $f_2(x)$ be functions such that $f_1(x) = O(g_1(x))$ and $f_2(x) = O(g_2(x))$

Then: $f_1(x) + f_2(x) = O(\max(|g_1(x)|, |g_2(x)|))$ and $f_1(x) \cdot f_2(x) = O(g_1(x) \cdot g_2(x))$

Give a big-O estimate of $f(x) = (x + 1) \cdot \log(x^2 + 1) + 3x^2$

Solution: first find a big-O estimate for each factor of $(x+1) \cdot \log(x^2 + 1)$.

$x + 1$: $x + 1 < x + x = 2x$ for $x > 1$.

Therefore, $x + 1 = O(x)$.

$\log(x^2 + 1)$: $\log(x^2 + 1) < \log(x^2 + x^2) = \log(2x^2) = \log(2) + 2\log(x)$ for $x > 1$.

$< 3\log(x)$, when $x > 2$ (since then $\log(2) < \log(x)$).

Therefore: $\log(x^2 + 1) = O(\log(x))$.

So, by the product rule, the big-O of the product is the product of the two separate big-O's: $O(x\log(x))$.

Then we find the big-O of the remaining term, $3x^2$, which is $O(x^2)$. By the sum rule, the big-O of the whole expression is the larger of $O(x\log(x))$ and $O(x^2)$:

$f(x) = O(x^2)$.
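As a final sanity check, candidate witnesses for the estimate can be tested at sample points. The pair $C = 10$, $k = 2$ below is my own illustrative choice, and sampling is a spot check rather than a proof:

```python
import math

# The function from the worked example above.
f = lambda x: (x + 1) * math.log(x ** 2 + 1) + 3 * x ** 2

# Spot-check |f(x)| <= 10 * x^2 for sampled x > 2.
assert all(f(x) <= 10 * x ** 2 for x in range(3, 10_000))
print(f(10), 10 * 10 ** 2)
```

For large $x$ the product term $(x+1)\log(x^2+1)$ is dwarfed by $3x^2$, which is why a modest constant like $10$ suffices.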