Big Oh Notation

Suppose you have counted the number of steps a program takes and arrived at the following expression for its worst-case running time as a function of the input size:

$$ T(n) = 12754n^{2} + 4353n + 834\lg n + 13546 $$

We can describe the runtime of this program simply as $O(n^2)$ (read as "big Oh of $n$ squared").

$T(n)$ is the precise running time, whereas $O(n^2)$ is the asymptotic running time of the program.

How do we go from $T(n)$ to $O(n^2)$?

We suppress the constant factors (setting each to $1$, which gives $n^{2} + n + \lg n + 1$) and drop the lower-order terms (which leaves $n^{2}$).
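Putting the two steps together:

$$ T(n) = 12754n^{2} + 4353n + 834\lg n + 13546 \;\longrightarrow\; n^{2} + n + \lg n + 1 \;\longrightarrow\; n^{2} $$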

This may seem excessively imprecise, especially after we put so much effort into calculating the running time under the RAM model in the previous chapter! It turns out that this approximation is sufficient for comparing the efficiency of algorithms.
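To get a feel for why dropping these details is safe, here is a small Python sketch (the function name `T` is just for illustration) that evaluates the exact step count above and reports what fraction of it comes from the quadratic term; as $n$ grows, that fraction approaches $1$, so the lower-order terms quickly stop mattering:

```python
import math

def T(n):
    # Exact worst-case step count from the formula above
    return 12754 * n**2 + 4353 * n + 834 * math.log2(n) + 13546

for n in [10, 100, 1_000, 10_000, 100_000]:
    quadratic = 12754 * n**2
    print(f"n = {n:>7}: quadratic term is {quadratic / T(n):.4%} of T(n)")
```

Already at $n = 1000$ the quadratic term accounts for essentially all of $T(n)$, which is why comparing algorithms by their dominant term alone is good enough.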