Introduction to Asymptotic Analysis and Big O
Asymptotic analysis is a way to classify the running time complexity of algorithms.
We have seen that the time complexity of an algorithm can be expressed as a polynomial. To compare two algorithms, we can compare the respective polynomials. However, this kind of analysis is a bit cumbersome and becomes intractable for the larger algorithms that we tend to encounter in practice.
Asymptotic Analysis
One observation that helps us is that we want to worry about large input sizes only. If the input size is really small, how bad can a poorly designed algorithm get, right? Mathematicians have a tool for this sort of analysis called asymptotic notation. The asymptotic notation compares two functions, say, $f(n)$ and $g(n)$, for very large values of $n$. This fits in nicely with our need for comparing algorithms for very large input sizes.
Big O Notation
One of the asymptotic notations is the Big O notation. A function $f(n)$ is considered $O(g(n))$, read as big oh of $g(n)$, if there exists some positive real constant $c$ and an integer $n_0$, such that the following inequality holds for all $n \ge n_0$:

$$f(n) \le c \cdot g(n)$$
The following graph shows a plot of a function $f(n)$ and $c \cdot g(n)$ that demonstrates this inequality.
Note that the above inequality does not have to hold for all values of $n$. For $n < n_0$, $f(n)$ is allowed to exceed $c \cdot g(n)$, but for all values of $n$ beyond some value $n_0$, $f(n)$ is not allowed to exceed $c \cdot g(n)$.
What good is this? It tells us that for very large values of $n$, $f(n)$ will be at most within a constant factor of $g(n)$. In other words, $f(n)$ will grow no faster than a constant multiple of $g(n)$. Put yet another way, the rate of growth of $f(n)$ is within constant factors of that of $g(n)$.
People tend to write $f(n) = O(g(n))$, which isn't technically accurate. A whole lot of functions can satisfy the $O(g(n))$ conditions. Thus, $O(g(n))$ is a set of functions. It is OK to say that $f(n)$ belongs to $O(g(n))$.
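As a quick illustration of the definition (not a proof), the following Python sketch checks the inequality $f(n) \le c \cdot g(n)$ over a finite range of $n$; the particular $f$, $g$, $c$, and $n_0$ below are arbitrary choices made up for demonstration.

```python
# Illustrative check of the Big O definition: f(n) <= c * g(n) for all n >= n0.
# Checking a finite range is only a sanity check, not a proof.

def satisfies_big_o(f, g, c, n0, n_max=10_000):
    """Return True if f(n) <= c * g(n) for every n in [n0, n_max]."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# Example choices (assumptions for illustration): f(n) = 3n + 5, g(n) = n.
f = lambda n: 3 * n + 5
g = lambda n: n

print(satisfies_big_o(f, g, c=4, n0=5))   # True: 3n + 5 <= 4n once n >= 5
print(satisfies_big_o(f, g, c=4, n0=1))   # False: the inequality fails for n = 1, ..., 4
```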
Example
Let's consider an algorithm whose running time is given by $f(n) = 4n^2 + 3n + 2$. Let us try to verify that this algorithm's time complexity is in $O(n^2)$. To do this, we need to find a positive constant $c$ and an integer $n_0$, such that for all $n \ge n_0$:

$$4n^2 + 3n + 2 \le c \cdot n^2$$
The above inequality would still be true if we re-wrote it while replacing $4n^2 + 3n + 2$ with $4n^2 + 3n^2 + 2n^2$. What we have done is replace the variable part in all terms with $n^2$, the variable part of the highest order term. This gives us:

$$4n^2 + 3n + 2 \le 4n^2 + 3n^2 + 2n^2 = 9n^2$$
This does not violate the inequality because each term on the right-hand side is greater than or equal to the corresponding term on the left-hand side (for $n \ge 1$). Now, comparing it with the definition of Big O, we can pick $c = 9$.
That leaves $n_0$. For what values of $n$ is the inequality satisfied? All of them, actually! So, we can pick $n_0 = 1$.
The above solution is not unique. We could have picked any value for $c$ that exceeds the coefficient of the highest power of $n$ in $f(n)$. Suppose we decided to pick $c = 5$. The reader can verify that the inequality $4n^2 + 3n + 2 \le 5n^2$ still holds for $n = 4$ or higher.
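To sanity-check these choices numerically, here is a minimal Python sketch (assuming the example polynomial $4n^2 + 3n + 2$ above) that tests a few $(c, n_0)$ pairs over a finite range of $n$:

```python
# Numeric sanity check (not a proof) for f(n) = 4n^2 + 3n + 2 against c * n^2.

def holds(c, n0, n_max=10_000):
    """Check 4n^2 + 3n + 2 <= c * n^2 for every n in [n0, n_max]."""
    return all(4 * n * n + 3 * n + 2 <= c * n * n for n in range(n0, n_max + 1))

print(holds(c=9, n0=1))  # True
print(holds(c=5, n0=4))  # True
print(holds(c=5, n0=1))  # False: the inequality fails for n = 1, 2, 3
print(holds(c=4, n0=1))  # False: no n0 works when c does not exceed 4
```

A finite check like this cannot prove the bound, but it is a handy way to catch a bad choice of $c$ or $n_0$.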
Note that it is not possible to find a constant $c$ and $n_0$ to show that $4n^2 + 3n + 2$ is $O(n)$ or $O(n \log n)$. It is possible to show that $4n^2 + 3n + 2$ is $O(n^3)$ or $O(n^4)$ or any higher power of $n$. Mathematically, it is correct to say that $4n^2 + 3n + 2$ is $O(n^3)$, but from a computer science point of view it is not very useful. It gives us a loose bound on the asymptotic running time of the algorithm. When dealing with time and space complexities, we are generally interested in the tightest possible bound when it comes to the asymptotic notation.
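To see why no constant can work for $O(n)$, here is one short argument (a sketch): dividing both sides by $n$ gives

$$4n^2 + 3n + 2 \le c \cdot n \;\Longrightarrow\; 4n + 3 + \frac{2}{n} \le c \quad \text{for } n \ge 1,$$

but the left-hand side grows without bound as $n$ increases, so no single constant $c$ can satisfy the inequality for all $n \ge n_0$.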
Suppose algorithms A and B have running times of $O(n)$ and $O(n^2)$, respectively. For sufficiently large input sizes, algorithm A will run faster than algorithm B. That does not mean that algorithm A will always run faster than algorithm B.
Now suppose algorithms A and B both have a running time of $O(n)$. The execution times of these algorithms, for very large input sizes, will be within constant factors of each other. For all practical purposes, they are considered equally good.
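The point about constant factors can be made concrete with hypothetical step counts (the constants below are made up purely for illustration): suppose algorithm A performs roughly $1000n$ primitive operations while algorithm B performs roughly $n^2$.

```python
# Hypothetical step counts (purely illustrative): A is O(n), B is O(n^2),
# but A carries a much larger constant factor than B.

def steps_a(n):
    return 1000 * n   # O(n) with a large constant factor

def steps_b(n):
    return n * n      # O(n^2) with a small constant factor

for n in (10, 100, 1_000, 10_000, 100_000):
    a, b = steps_a(n), steps_b(n)
    winner = "A" if a < b else ("tie" if a == b else "B")
    print(f"n={n:>7,}: A={a:>15,}  B={b:>15,}  fewer steps: {winner}")
```

B performs fewer steps until the crossover at $n = 1000$; beyond that, A's linear growth wins for good.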
Simplified Asymptotic Analysis
Once we have obtained the time complexity of an algorithm by counting the number of primitive operations, we can arrive at the Big O notation for the algorithm simply by:
- Dropping the multiplicative constants from all terms
- Dropping all but the highest order term
Thus, $4n^2 + 3n + 2$ is $O(n^2)$, while $3n^3 + 2n^2 + 5$ is $O(n^3)$.
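The two rules are mechanical enough to sketch in code. The helper below is a toy (it assumes, for illustration, that the running time is a plain polynomial in $n$; it does not handle logarithms or exponentials):

```python
# A toy sketch of the two simplification rules for a polynomial running time.
# A polynomial is represented as a list of (coefficient, exponent) terms.

def big_o(terms):
    """Drop multiplicative constants and all but the highest-order term."""
    highest = max(exponent for _, exponent in terms)  # keep only the highest order
    if highest == 0:
        return "O(1)"
    if highest == 1:
        return "O(n)"   # the coefficient is dropped either way
    return f"O(n^{highest})"

# 4n^2 + 3n + 2  ->  O(n^2)
print(big_o([(4, 2), (3, 1), (2, 0)]))
# 3n^3 + 2n^2 + 5  ->  O(n^3)
print(big_o([(3, 3), (2, 2), (5, 0)]))
```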
A Comparison of Some Common Functions
It is easy to work with simple polynomials in $n$, but when the time complexity involves other types of functions like $n \log n$, you may find it hard to identify the “highest order term”. The following table lists some commonly encountered functions in ascending order of rate of growth. Any function can be given as Big O of any other function that appears later in this table.
Function | Name |
---|---|
Any constant | Constant |
$\log n$ | Logarithmic |
$\log^2 n$ | Log-square |
$\sqrt{n}$ | Root-n |
$n$ | Linear |
$n \log n$ | Linearithmic |
$n^2$ | Quadratic |
$n^3$ | Cubic |
$n^4$ | Quartic |
$2^n$ | Exponential |
$e^n$ | Exponential |
$n!$ | n-Factorial |
The following graph visually shows some of the functions from the above table.
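If you prefer numbers to graphs, a rough sketch like the following tabulates a few of these functions for growing $n$ and shows how quickly they separate:

```python
import math

# Tabulate a few common growth-rate functions to see how quickly they separate.
functions = [
    ("log n",   lambda n: math.log2(n)),
    ("sqrt n",  lambda n: math.sqrt(n)),
    ("n",       lambda n: n),
    ("n log n", lambda n: n * math.log2(n)),
    ("n^2",     lambda n: n ** 2),
    ("2^n",     lambda n: 2 ** n),
]

for n in (10, 20, 40, 80):
    row = "  ".join(f"{name}={value(n):,.0f}" for name, value in functions)
    print(f"n={n:>2}: {row}")
# Even at n = 80, 2^n dwarfs every polynomial entry in the table.
```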
Quick quiz on Big O!
$4n^2 + 3n + 2$ is in $O(n^3)$.
True
False