Algorithm Complexity: All You Should Know About It
Published at 09:22 PM
Introduction
Algorithm complexity is a measure of how long an algorithm takes to complete given an input of size n. If an algorithm has to scale, it should produce the result within a finite and practical time bound even for large values of n. For this reason, complexity is calculated asymptotically, as n approaches infinity.
Counting the number of operations lets you compare the efficiency of algorithms. As mentioned in the Introduction, the analysis assumes a sufficiently large amount of processed data, where n → ∞; therefore, the growth rate is what matters, not the exact number of operations.
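To make the idea of counting operations concrete, here is a small Python sketch (the function names are illustrative) that counts the comparisons performed by a linear scan and by a binary search over the same sorted data:

```python
def linear_search_ops(data, target):
    """Count comparisons made by a linear scan until target is found."""
    ops = 0
    for value in data:
        ops += 1
        if value == target:
            break
    return ops

def binary_search_ops(data, target):
    """Count comparisons made by a binary search over sorted data."""
    ops = 0
    lo, hi = 0, len(data) - 1
    while lo <= hi:
        ops += 1
        mid = (lo + hi) // 2
        if data[mid] == target:
            break
        if data[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return ops

data = list(range(1_000_000))
# Worst case for the scan: the target sits at the very end.
print(linear_search_ops(data, 999_999))  # 1000000 operations
print(binary_search_ops(data, 999_999))  # roughly log2(1000000), about 20
```

Doubling n roughly doubles the linear count but adds only one comparison to the binary one, which is exactly why the growth rate, not the raw count, is what the analysis tracks.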
Now we are ready to dive into the details. Let's define the function classes and consider the asymptotic notations that describe the running time of a particular program or algorithm:
O(g(n)) - functions that grow no faster than g
Ω(g(n)) - functions that grow at least as fast as g
Θ(g(n)) - functions that grow at the same rate as g
For a given function g(n), the notation Θ(g(n)) denotes: Θ(g(n)) = {f(n) : ∃c₁ > 0, ∃c₂ > 0, ∃n₀ ≥ 0 such that 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀}
For a given function g(n), the notation O(g(n)) denotes: O(g(n)) = {f(n) : ∃c > 0, ∃n₀ ≥ 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀}
For a given function g(n), the notation Ω(g(n)) denotes: Ω(g(n)) = {f(n) : ∃c > 0, ∃n₀ ≥ 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀}
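As a quick illustration of the O-definition above, the following sketch checks a concrete witness pair (c, n₀) for f(n) = 3n² + 10n against g(n) = n². A finite check like this is a sanity test, not a proof:

```python
# f(n) = 3n^2 + 10n is O(n^2), witnessed by c = 4 and n0 = 10,
# because 3n^2 + 10n <= 4n^2 exactly when n >= 10.
def f(n):
    return 3 * n**2 + 10 * n

def g(n):
    return n**2

c, n0 = 4, 10
# Verify 0 <= f(n) <= c*g(n) over a large finite range starting at n0.
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))
print("witness (c=4, n0=10) holds on the checked range")
```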
Comparing Functions
When we deal with complex algorithms, we can consider the limit of the ratio of the complexity functions:
lim(n→∞) f(n)/g(n)
Based on the value of the limit, we can compare the growth rates of the functions:
The limit is a nonzero constant. It means that the functions grow at the same rate: f(n) = Θ(g(n))
The limit is equal to zero. It means that g(n) grows faster than f(n): f(n) = O(g(n))
The limit is equal to infinity. It means that g(n) grows slower than f(n): f(n) = Ω(g(n))
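The limit test can be approximated numerically. The sketch below (the thresholds are arbitrary, and evaluating at one finite n cannot prove a limit, so treat it as a heuristic) classifies the growth by evaluating the ratio f(n)/g(n) at a single large n:

```python
import math

def classify(f, g, n=10**9):
    """Heuristic growth comparison via the ratio f(n)/g(n) at a large n.
    A tiny ratio suggests f = O(g), a huge one suggests f = Omega(g),
    anything in between suggests f = Theta(g)."""
    r = f(n) / g(n)
    if r < 1e-6:
        return "f = O(g)"
    if r > 1e6:
        return "f = Omega(g)"
    return "f = Theta(g)"

print(classify(lambda n: math.log2(n), lambda n: n))  # f = O(g)
print(classify(lambda n: 5 * n + 3, lambda n: n))     # f = Theta(g)
print(classify(lambda n: n * n, lambda n: n))         # f = Omega(g)
```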
Also, by L'Hôpital's rule, we can consider the ratio of the derivatives of the functions:
lim(n→∞) f(n)/g(n) = lim(n→∞) f′(n)/g′(n)
provided the functions have the following properties:
lim(n→∞) f(n)/g(n) is an indeterminate form 0/0 or ∞/∞
f(n) and g(n) are differentiable
g′(n) ≠ 0 for sufficiently large n
the limit lim(n→∞) f′(n)/g′(n) exists
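As a numeric illustration of the rule (the concrete functions are my own choice): for f(n) = ln n and g(n) = √n, both tend to infinity, so the ∞/∞ form applies; the derivatives give f′(n)/g′(n) = (1/n) / (1/(2√n)) = 2/√n → 0, hence f(n) = O(g(n)). The raw ratio indeed shrinks alongside the derivative ratio:

```python
import math

# f(n) = ln(n), g(n) = sqrt(n): both diverge, so L'Hopital applies.
# f'(n)/g'(n) simplifies to 2/sqrt(n), which tends to 0.
for n in (10**4, 10**8, 10**12):
    ratio = math.log(n) / math.sqrt(n)   # f(n)/g(n)
    derivative_ratio = 2 / math.sqrt(n)  # f'(n)/g'(n)
    print(n, ratio, derivative_ratio)    # both shrink toward 0
```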
Example
Compare the two functions f(n) = n·log₂(n) and g(n) = (log₂(n))^2019. Since any positive power of n eventually outgrows any constant power of log n, the ratio f(n)/g(n) tends to infinity, so f(n) = Ω(g(n)).
Master Theorem
Let us have a recurrence relation of the following form:
T(n) = a·T(n/b) + O(n^c) for n > 1, and T(n) = O(1) for n = 1,
where a ∈ ℕ, b ∈ ℝ, b > 1, c ∈ ℝ⁺. Then the solution of the given recurrence relation has the form:
If c > log_b(a), then T(n) = Θ(n^c)
If c = log_b(a), then T(n) = Θ(n^c · log_b(n))
If c < log_b(a), then T(n) = Θ(n^(log_b(a)))
Proof:
Let's consider the recursion tree. It has log_b(n) levels, where level k has a^k vertices, each costing (n/b^k)^c.
Let's sum the costs of the tree over all levels:
T(n) = Σ_{k=0}^{log_b(n)} a^k · (n/b^k)^c = n^c · Σ_{k=0}^{log_b(n)} (a/b^c)^k
We get three cases:
If c > log_b(a), then Σ (a/b^c)^k is the sum of a decreasing geometric progression (since a/b^c < 1), which is bounded by a constant that doesn't depend on n. Hence T(n) = Θ(n^c).
If c = log_b(a), then a/b^c = 1, so T(n) = n^c · Σ_{k=0}^{log_b(n)} 1^k = Θ(n^c · log_b(n))
If c < log_b(a), then, since the sum of an increasing geometric progression is asymptotically equivalent to its largest term, we have the following: T(n) = n^c · Σ_{k=0}^{log_b(n)} (a/b^c)^k = Θ(n^c · (a/b^c)^(log_b(n))) = Θ(n^c · a^(log_b(n)) / n^c) = Θ(a^(log_b(n))) = Θ(n^(log_b(a)))
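The three cases can be packaged into a small helper (the function name and output formatting are illustrative; the recurrence is the two-parameter form T(n) = a·T(n/b) + O(n^c) used above):

```python
import math

def master_theorem(a, b, c):
    """Return the asymptotic solution of T(n) = a*T(n/b) + O(n^c)
    according to the three master-theorem cases."""
    crit = math.log(a, b)  # log_b(a), the critical exponent
    if c > crit:
        return f"Theta(n^{c})"
    if math.isclose(c, crit):
        return f"Theta(n^{c} * log n)"
    return f"Theta(n^log_{b}({a}))"

# Merge sort: T(n) = 2T(n/2) + O(n) -> Theta(n log n)
print(master_theorem(2, 2, 1))  # Theta(n^1 * log n)
# Binary search: T(n) = T(n/2) + O(1) -> Theta(log n)
print(master_theorem(1, 2, 0))  # Theta(n^0 * log n)
# Karatsuba multiplication: T(n) = 3T(n/2) + O(n) -> Theta(n^log_2(3))
print(master_theorem(3, 2, 1))  # Theta(n^log_2(3))
```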
Recurrence Relations
There are three main methods for solving recurrence relations: the substitution method, the recursion tree, and the master theorem. Let's consider each of them.
Substitution method
In order to solve a recurrence relation using the substitution method, it's necessary to complete two steps:
Make a guess about the form of the solution;
Use mathematical induction to prove that the guess is correct.
Example
Show that the solution of T(n) = T(⌈n/2⌉) + 1 is O(lg n)
Proof:
Assume that T(n) ≤ c·lg(n − a) (we use the definition of big-O from the Asymptotic Notations part), where a is a constant to be chosen later. Now we substitute this into the initial recurrence, using ⌈n/2⌉ ≤ (n + 1)/2:
T(n) ≤ c·lg(⌈n/2⌉ − a) + 1 ≤ c·lg((n + 1 − 2a)/2) + 1 = c·lg(n + 1 − 2a) − c + 1 ≤ c·lg(n − a)
The last inequality holds because n + 1 − 2a ≤ n − a for a ≥ 1, and −c + 1 ≤ 0 for c ≥ 1. We proved that O(lg n) is a solution for the given relation when a ≥ 1, c ≥ 1.
Let’s see another example.
Show that the solution of T(n)=T(n−1)+n is O(n2)
Proof:
Assume that T(n) ≤ c·n². Let's substitute this inequality into the given recurrence.
Then,
T(n) ≤ c·(n − 1)² + n = c·n² − 2cn + c + n = c·n² + n(1 − 2c) + c ≤ c·n²
The last step holds when c ≥ 1 and n ≥ 1, since then n(1 − 2c) + c ≤ 0.
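Since T(n) = T(n − 1) + n with T(1) = 1 unrolls to the exact closed form n(n + 1)/2, the claimed bound can also be checked numerically (a sanity check, not a proof):

```python
def T(n):
    """Unroll T(n) = T(n-1) + n with T(1) = 1 iteratively.

    Iteration avoids deep recursion; the result equals n*(n+1)/2.
    """
    total = 0
    for k in range(1, n + 1):
        total += k
    return total

c = 1
# The inductive bound T(n) <= c*n^2 holds with c = 1 for all n >= 1.
assert all(T(n) <= c * n * n for n in range(1, 2000))
print(T(100))  # 5050, matching the closed form 100*101/2
```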
Recursion Tree
Another way to find a solution to a recurrence relation is to use a recursion tree.
In this method, the recurrence relation is converted into a recursion tree whose nodes represent the cost incurred at each level of recursion. To find the total cost, the costs of all levels are summed up.
Example
Use a recursion tree to determine a good asymptotic upper bound on the recurrence T(n) = 4T(n/2 + 2) + n. Use the substitution method to verify your answer.
Proof:
The tree has log₂(n) levels. The subproblem size for a node at depth i is n/2^i. The subproblem size reaches 1 when n/2^i = 1 ⇒ i = log₂(n).
The last level, at depth log₂(n), contains 4^(log₂(n)) = n² nodes.
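A quick numeric sanity check (assuming the recurrence T(n) = 4T(n/2 + 2) + n with an arbitrary constant base case, and integer division in place of exact halving): if T(n) = Θ(n²), then doubling n should roughly quadruple T(n).

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 4*T(n/2 + 2) + n, with a constant base case for small n."""
    if n <= 16:  # any constant base case works asymptotically
        return 1
    return 4 * T(n // 2 + 2) + n

for n in (10**3, 10**4, 10**5):
    print(n, T(2 * n) / T(n))  # ratios approach 4, consistent with Theta(n^2)
```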