# Asymptotic notation

It is often difficult to derive the exact running time of an algorithm. In such situations one is forced to settle for approximations of the running time, and usually may only derive the *asymptotic* running time. That is, one studies how the running time of the algorithm increases as the size of the input increases without bound.

In what follows, the only functions considered are those which are defined on the positive integers and take on real values that are always positive from some point onwards. Let $f$ and $g$ be two such functions.

- 2.55 Definition *(order notation)*
  - (i) *(asymptotic upper bound)* $f(n) = O(g(n))$ if there exists a positive constant $c$ and a positive integer $n_0$ such that $0 \le f(n) \le c g(n)$ for all $n \ge n_0$.
  - (ii) *(asymptotic lower bound)* $f(n) = \Omega(g(n))$ if there exists a positive constant $c$ and a positive integer $n_0$ such that $0 \le c g(n) \le f(n)$ for all $n \ge n_0$.
  - (iii) *(asymptotic tight bound)* $f(n) = \Theta(g(n))$ if there exist positive constants $c_1$ and $c_2$, and a positive integer $n_0$ such that $c_1 g(n) \le f(n) \le c_2 g(n)$ for all $n \ge n_0$.
  - (iv) *(o-notation)* $f(n) = o(g(n))$ if for any positive constant $c > 0$ there exists a constant $n_0 > 0$ such that $0 \le f(n) < c g(n)$ for all $n \ge n_0$.

Intuitively, $f(n) = O(g(n))$ means that $f$ grows no faster asymptotically than $g(n)$ to within a constant multiple, while $f(n) = \Omega(g(n))$ means that $f(n)$ grows at least as fast asymptotically as $g(n)$ to within a constant multiple. $f(n) = o(g(n))$ means that $g(n)$ is an upper bound for $f(n)$ that is not asymptotically tight, or in other words, the function $f(n)$ becomes insignificant relative to $g(n)$ as $n$ gets larger. The expression $o(1)$ is often used to signify a function $f(n)$ whose limit as $n$ approaches $\infty$ is $0$.
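The defining inequality of the asymptotic upper bound can be spot-checked numerically. The sketch below uses illustrative choices not taken from the text: $f(n) = 3n^2 + 2n + 7$, $g(n) = n^2$, and the witnesses $c = 4$, $n_0 = 10$. A finite check cannot, of course, prove the quantifier over all $n \ge n_0$; it merely illustrates how the witnesses work.

```python
# Spot check of Definition 2.55(i): f(n) = O(g(n)) holds if some constant c
# and threshold n0 make 0 <= f(n) <= c*g(n) for every n >= n0.
# The functions f, g and the witnesses c = 4, n0 = 10 are illustrative
# choices, not from the text.

def f(n):
    return 3 * n**2 + 2 * n + 7   # a sample quadratic


def g(n):
    return n**2                   # candidate asymptotic upper bound

c, n0 = 4, 10

# Verify the defining inequality on a finite range (a sanity check only;
# the definition quantifies over all n >= n0).
holds = all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))
print(holds)  # True
```

Any larger $c$ (or larger $n_0$) also works, which is why the constants are existentially quantified: only *some* witness pair is required.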

- 2.56 Fact *(properties of order notation)* For any functions $f(n)$, $g(n)$, $h(n)$, and $l(n)$, the following are true.
  - (i) $f(n) = O(g(n))$ if and only if $g(n) = \Omega(f(n))$.
  - (ii) $f(n) = \Theta(g(n))$ if and only if $f(n) = O(g(n))$ and $f(n) = \Omega(g(n))$.
  - (iii) If $f(n) = O(h(n))$ and $g(n) = O(h(n))$, then $(f + g)(n) = O(h(n))$.
  - (iv) If $f(n) = O(h(n))$ and $g(n) = O(l(n))$, then $(f \cdot g)(n) = O(h(n)\,l(n))$.
  - (v) *(reflexivity)* $f(n) = O(f(n))$.
  - (vi) *(transitivity)* If $f(n) = O(g(n))$ and $g(n) = O(h(n))$, then $f(n) = O(h(n))$.
- 2.57 Fact
*(approximations of some commonly occurring functions)*
  - (i) *(polynomial function)* If $f(n)$ is a polynomial of degree $k$ with positive leading term, then $f(n) = \Theta(n^k)$.
  - (ii) For any constant $c > 0$, $\log_c n = \Theta(\lg n)$.
  - (iii) *(Stirling's formula)* For all integers $n \ge 1$,
    $$\sqrt{2\pi n}\left(\frac{n}{e}\right)^{n} \le n! \le \sqrt{2\pi n}\left(\frac{n}{e}\right)^{n + (1/(12n))}.$$
    Thus $n! = \sqrt{2\pi n}\left(\frac{n}{e}\right)^{n}\left(1 + \Theta\left(\frac{1}{n}\right)\right)$. Also, $n! = O(n^n)$ and $n! = \Omega(2^n)$.
  - (iv) $\lg(n!) = \Theta(n \lg n)$.
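Stirling's formula from Fact 2.57(iii) lends itself to a quick numerical illustration: it bounds $n!$ between $\sqrt{2\pi n}\,(n/e)^n$ and the same quantity times $e^{1/(12n)}$, so the ratio of $n!$ to the lower bound lies strictly between $1$ and $e^{1/(12n)}$ and tends to $1$. The sample values of $n$ below are arbitrary illustrative choices.

```python
import math

# Stirling's formula bounds n! between sqrt(2*pi*n)*(n/e)^n and that same
# quantity times e^(1/(12n)).  Hence the ratio n!/lower_bound lies in the
# open interval (1, e^(1/(12n))) and approaches 1 as n grows.

def stirling_lower(n):
    """The lower bound sqrt(2*pi*n) * (n/e)^n."""
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (1, 5, 20, 100):
    ratio = math.factorial(n) / stirling_lower(n)
    # The ratio shrinks toward 1, squeezed under e^(1/(12n)).
    print(n, ratio, math.exp(1 / (12 * n)))
```

Even at $n = 1$ the approximation is within about 9%, and the error term decays like $1/(12n)$, which is the content of the $\Theta(1/n)$ correction factor.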
- 2.58 Example
*(comparative growth rates of some functions)* Let $\epsilon$ and $c$ be arbitrary constants with $0 < \epsilon < 1 < c$. The following functions are listed in increasing order of their asymptotic growth rates:
$$1 \;<\; \ln\ln n \;<\; \ln n \;<\; e^{\sqrt{\ln n \,\ln\ln n}} \;<\; n^{\epsilon} \;<\; n^{c} \;<\; n^{\ln n} \;<\; c^{n} \;<\; n^{n} \;<\; c^{c^{n}}.$$
□
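The ordering of growth rates in Example 2.58 (taking the standard chain $1 < \ln\ln n < \ln n < e^{\sqrt{\ln n \ln\ln n}} < n^{\epsilon} < n^{c} < n^{\ln n} < c^{n} < n^{n} < c^{c^{n}}$) can be spot-checked at a single large $n$ by comparing natural logarithms, since $\log$ is increasing and therefore order-preserving. The choices $n = 10^6$, $\epsilon = 0.5$, $c = 2$ are illustrative; the final function $c^{c^n}$ is omitted because even its logarithm overflows double precision. A single sample point does not prove an asymptotic statement, and for small $n$ some adjacent pairs actually appear out of order.

```python
import math

# Compare ln of each function in Example 2.58 at one large n.  Since ln is
# strictly increasing, the original ordering holds iff these logs increase.
# n, eps, c are illustrative; c^(c^n) is skipped (its log alone overflows).

n, eps, c = 10**6, 0.5, 2
ln_n = math.log(n)
ln_ln_n = math.log(ln_n)

logs = [
    0.0,                        # ln(1)
    math.log(ln_ln_n),          # ln(ln ln n)
    math.log(ln_n),             # ln(ln n)
    math.sqrt(ln_n * ln_ln_n),  # ln(e^sqrt(ln n * ln ln n))
    eps * ln_n,                 # ln(n^eps)
    c * ln_n,                   # ln(n^c)
    ln_n * ln_n,                # ln(n^(ln n))
    n * math.log(c),            # ln(c^n)
    n * ln_n,                   # ln(n^n)
]
print(all(a < b for a, b in zip(logs, logs[1:])))  # True
```

Note that $e^{\sqrt{\ln n \ln\ln n}} < n^{\epsilon}$ only once $\ln\ln n < \epsilon^2 \ln n$; with $\epsilon = 0.5$ that fails at, say, $n = 100$, which is why a large sample point is needed.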