Algorithm Analysis
We usually determine the efficiency of an algorithm by analyzing its time and space complexity.
An input is a data structure that is given to an algorithm to solve a problem. The size of an input is the number of elements in the input.
How do we assess time complexity?
When we analyze time complexity, we are interested in the efficiency of an algorithm as a function of the size of the input.
As different machines have processors of varying speeds, wall-clock time is not a good measure of algorithm efficiency.
We can instead consider the number of operations an algorithm performs relative to the size of the input, which isolates the algorithm from the machine it runs on.
However, different operations have varying efficiencies, so we consider the growth rate of the total number of operations as a function of the input size. This kind of analysis is called asymptotic analysis.
- **Time complexity**: the rate of growth of the total number of operations an algorithm performs relative to the size of the input.
- **Space complexity**: the rate of growth of the total amount of memory an algorithm uses relative to the size of the input.
Asymptotic notation
We define a function $f(n)$ to represent the total number of operations with respect to $n$ (the size of the input).
When we say $f(n)$ is of $O(g(n))$ (or $f(n) = O(g(n))$), we are saying that the growth rate of $f(n)$ is bounded by $g(n)$ under a specific mathematical inequality defined by $O$.
Note that the $=$ is interchangeable with $\in$; either reading is acceptable.
To prove non-strict boundaries, we need to show the existence of constants $c > 0$ and $n_0 > 0$ that satisfy the inequality.
Notation | Condition | Asymptotic boundary | Name |
---|---|---|---|
$f(n) = O(g(n))$ | $\exists\, c, n_0 > 0$ such that $0 \le f(n) \le c\,g(n)$ for all $n \ge n_0$ | Upper | Big O |
$f(n) = \Omega(g(n))$ | $\exists\, c, n_0 > 0$ such that $0 \le c\,g(n) \le f(n)$ for all $n \ge n_0$ | Lower | Big Omega |
$f(n) = \Theta(g(n))$ | $\exists\, c_1, c_2, n_0 > 0$ such that $0 \le c_1\,g(n) \le f(n) \le c_2\,g(n)$ for all $n \ge n_0$ | Tight | Big Theta / Order of growth |
$f(n) = o(g(n))$ | $\forall\, c > 0$, $\exists\, n_0 > 0$ such that $0 \le f(n) < c\,g(n)$ for all $n \ge n_0$ | Strictly upper | Little o |
$f(n) = \omega(g(n))$ | $\forall\, c > 0$, $\exists\, n_0 > 0$ such that $0 \le c\,g(n) < f(n)$ for all $n \ge n_0$ | Strictly lower | Little omega |
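The Big O condition can be spot-checked numerically. In this sketch, the functions $f$, $g$ and the witness constants $c = 4$, $n_0 = 5$ are illustrative choices, not taken from the notes:

```python
def is_bounded_above(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c * g(n) for all n in [n0, n_max].

    A finite spot-check of the Big O inequality, not a proof."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 3 * n**2 + 5 * n   # f(n) = 3n^2 + 5n
g = lambda n: n**2               # g(n) = n^2

# With c = 4 and n0 = 5:  3n^2 + 5n <= 4n^2  <=>  5n <= n^2  <=>  n >= 5,
# so the check passes and f(n) = O(n^2).
print(is_bounded_above(f, g, c=4, n0=5))   # True
```

A full proof would establish the inequality algebraically for all $n \ge n_0$, as in the comment above; the code only verifies it over a finite range.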
We use the likeness of $g(n)$ to describe the complexity of $f(n)$. The following are the common growth rate functions (in increasing order of growth):

$1,\ \log n,\ n,\ n \log n,\ n^2,\ n^3,\ 2^n,\ n!$
Note that the higher the growth rate, the more complex the algorithm.
Note the following growth rate equivalences:
- $\log(n!) = \Theta(n \log n)$ (proof not needed; related to the lower bound of comparison-based sorting algorithms)
- … as $n \to \infty$
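The first equivalence can be checked numerically. This sketch uses `math.lgamma` (where $\operatorname{lgamma}(n+1) = \ln n!$) to compare $\log(n!)$ against $n \log n$; the sample values of $n$ are arbitrary:

```python
import math

# Ratio of log(n!) to n*log(n); it should approach 1 as n grows,
# consistent with log(n!) = Theta(n log n).
for n in (10, 100, 1000, 10000):
    ratio = math.lgamma(n + 1) / (n * math.log(n))  # lgamma(n+1) = ln(n!)
    print(n, round(ratio, 3))
```

The ratio climbs toward 1 (slowly, since the gap is an additive $\Theta(n)$ term from Stirling's approximation), supporting the tight bound.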
Non-strict: the term with the highest growth rate in $f(n)$ gives $g(n)$.
Strict: choose the next / previous growth rate function in the list above, depending on whether an upper or lower bound is needed.
You can use the same logic to determine whether some $f(n)$ is of $O(g(n))$.
Note that as the definitions are specified by an inequality, there can be multiple satisfying $g(n)$ for the same $f(n)$, and all of them are valid. However, keep in mind that the only useful $g(n)$ for analysis is the one closest to the condition boundary.
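To see why only the highest-growth term matters, this sketch tracks how the dominant term of an assumed example, $f(n) = 5n^2 + 100n + 7$, takes over as $n$ grows:

```python
# f(n) = 5n^2 + 100n + 7: the quadratic term dominates for large n.
f = lambda n: 5 * n**2 + 100 * n + 7

for n in (10, 100, 1000, 10000):
    dominant = 5 * n**2
    # Fraction of f(n) contributed by the dominant term alone.
    print(n, round(dominant / f(n), 4))
```

The ratio tends to 1: the lower-order terms become negligible, so $f(n) = \Theta(n^2)$.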
If the function fluctuates, it does not have a consistent growth rate unless a boundary is specified.
Big Theta of recursive relations
When tasked to find the asymptotic notation of a recursive relation, we are required to give the Big Theta notation of the relation.
The straightforward way is to find the closed-form expression and identify its growth rate. However, this is not always possible.
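When a closed form is hard to obtain, we can still tabulate a recurrence and compare it with a candidate growth rate. This sketch uses $T(n) = 2T(n/2) + n$ with $T(1) = 1$ (an assumed example, not from the notes), whose known bound is $\Theta(n \log n)$:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """The recurrence T(n) = 2*T(n/2) + n with base case T(1) = 1."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

# Compare T(n) against the candidate n*log2(n) at powers of two.
for k in (4, 8, 12, 16):
    n = 2 ** k
    print(n, T(n) / (n * math.log2(n)))
```

The ratio stays bounded and approaches 1 (for powers of two the exact value is $T(n) = n \log_2 n + n$), consistent with $T(n) = \Theta(n \log n)$.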
Time complexity of code
Our goal is to examine how the runtime grows with respect to the input size. Consider the following example:
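A minimal sketch of such an example (assuming a simple summation loop; the per-line operation counts are illustrative):

```python
def total(arr):
    s = 0                 # 1 operation
    for x in arr:         # loop body runs n times
        s = s + x         # 2 operations per iteration (add, assign)
    return s              # 1 operation
```

Under the unit-cost convention below, this sketch performs roughly $2n + 2$ operations, which is $\Theta(n)$.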
It should be acknowledged that each operation / line takes a different amount of time to compute. However, as we are interested in the growth rate of the runtime $f(n)$, we can treat every operation as taking one unit of time.
Hence, $f(n)$ is obtained by counting the total number of operations executed as a function of $n$.
Growth | Occurrence |
---|---|
$\Theta(1)$ | Statements that run a set number of times with no respect to the input size |
$\Theta(\log n)$ | Algorithms that solve a big problem by transformation into a series of smaller problems, cutting the problem size by some constant fraction at each step |
$\Theta(n)$ | "for" loops that have $n$ iterations per program run |
$\Theta(n \log n)$ | Algorithms that solve a problem by breaking it up into smaller sub-problems, solving them independently, and then combining the solutions |
$\Theta(n^2)$ | Nested "for" loops that each have $n$ iterations per program run |
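The last table row can be confirmed by counting iterations directly. This sketch tallies the work done by a doubly nested loop (the loop bounds are illustrative):

```python
def count_pairs(n):
    """Count iterations of a nested loop: the body runs n * n times."""
    ops = 0
    for i in range(n):
        for j in range(n):
            ops += 1      # one unit of work per inner iteration
    return ops

for n in (10, 100, 1000):
    print(n, count_pairs(n))  # grows as n^2: 100, 10000, 1000000
```

Multiplying the input size by 10 multiplies the operation count by 100, the signature of $\Theta(n^2)$ growth.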