Average Case
In addition to the best case and the worst case, the average-case runtime of an algorithm is the function defined by the average number of steps taken over all instances of size $n$. It is typically calculated by considering "random" input instances under some assumption about the relative frequencies of different inputs.
The average case helps us understand how good or bad an algorithm is in general, across all instances. However, it often requires domain knowledge about the input distribution and usually becomes too complex to calculate.
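To make this concrete, consider a simple linear search under the assumption that the target is equally likely to be at any of the $n$ positions: the average number of comparisons is $\frac{1}{n}\sum_{i=1}^{n} i = \frac{n+1}{2}$, roughly half the worst case of $n$. The short Python sketch below (the function and experiment parameters are illustrative, not from any particular library) checks this empirically.

```python
import random

def linear_search(arr, target):
    """Return the number of comparisons used to find target in arr."""
    comparisons = 0
    for value in arr:
        comparisons += 1
        if value == target:
            break
    return comparisons

# Illustrative experiment: the target is drawn uniformly at random,
# matching the "random input" assumption used in average-case analysis.
n = 1_000
trials = 10_000
arr = list(range(n))
observed = sum(linear_search(arr, random.randrange(n)) for _ in range(trials)) / trials

print(f"observed average  : {observed:.1f}")   # close to (n + 1) / 2
print(f"analytical average: {(n + 1) / 2}")    # 500.5
print(f"worst case        : {n}")              # target at the last position
```

Note how the average case depends entirely on the assumed input distribution: if the target were always at the last position, the "average" would coincide with the worst case.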
We will focus on worst-case analysis, as it is generally easier to carry out and usually reflects the average case.
So, assume I am asking for worst-case analysis unless otherwise specified!
Focusing on the worst case may seem draconian: what if an algorithm performs well on most inputs except for a few pathological ones? This is certainly true for some algorithms, but in general, worst-case analysis has been found to do a reasonable job of capturing algorithm efficiency in practice.
Resources
- Stack Overflow, "Can anyone simply explain what is meant by best, worst and average case running times of an algorithm?"
- Wikipedia entry on Average-case complexity.
- Steven J. Zeil lecture notes on Analysis of Algorithms: Average Case Analysis.