Asymptotic Upper Bounds
Recall this exercise from last chapter:
You are the CEO of a startup software company. The app your company is building requires a component that performs some sort of image processing task. Your intern is told to survey the literature for potential solutions to this computational task. Upon surveying the literature, the intern finds three algorithms that solve the problem at hand. The algorithms have the following runtimes: $O(\lg n)$, $O(2^n)$, and $O(n)$. Which algorithm will you choose to implement?
Relying only on the mathematical notion of big Oh, all three algorithms might take a logarithmic amount of time to run, because $\lg n \in O(\lg n)$, $\lg n \in O(n)$, and $\lg n \in O(2^n)$.
It would be mathematically justified, but very strange (and unhelpful), for the literature the intern studied to describe three different logarithmic-time algorithms as being in $O(\lg n)$, $O(n)$, and $O(2^n)$, respectively.
Hence the convention: when we use big Oh to communicate how fast an algorithm is, we give the smallest big Oh time (the tightest upper bound) we can prove to be true.
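The memberships above can be sanity-checked numerically. Here is a minimal Python sketch; the helper `bounded_above` and the default constants $c$ and $n_0$ are our own choices for illustration, and checking a finite range of $n$ is of course not a proof, just supporting evidence:

```python
import math

def bounded_above(f, g, c=1, n0=2, n_max=1000):
    """Empirically check that f(n) <= c * g(n) for all n0 <= n <= n_max.

    This tests a finite range only -- a sanity check, not a proof
    of the asymptotic claim f in O(g).
    """
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

def lg(n):
    return math.log2(n)

# lg n is bounded above by lg n, by n, and by 2^n (with c = 1, n0 = 2),
# consistent with lg n being in O(lg n), O(n), and O(2^n).
print(bounded_above(lg, lg))                # True
print(bounded_above(lg, lambda n: n))       # True
print(bounded_above(lg, lambda n: 2 ** n))  # True

# The reverse fails: n is NOT bounded above by lg n with these constants.
print(bounded_above(lambda n: n, lg))       # False
```

The last check illustrates why the tightest bound is the informative one: every function in $O(\lg n)$ is also in $O(n)$ and $O(2^n)$, but not vice versa.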