Boštjan Cigan
9 min read · Jan 22, 2017


I have often wondered about calculating time complexities. There was always plenty of theory, but never enough practical examples that connected to the theory I was hearing in my algorithms classes.

One of the areas where I got stuck was calculating the time complexity of recursive algorithms. Sure, you know the methods, but you need to hit your head against the wall a few times before you understand how to use them.

The time complexity of an algorithm is commonly expressed using big O notation, which excludes coefficients and lower-order terms. It is commonly estimated by counting the number of elementary operations performed by the algorithm, where an elementary operation takes a fixed amount of time to perform. Thus the amount of time taken and the number of elementary operations performed by the algorithm differ by at most a constant factor.

The time complexity of recursive algorithms is a difficult thing to compute, but we do know two methods: one of them is the master theorem and the other one is the Akra-Bazzi method. The latter takes a more mathematical approach. In this article I will not explain what big O notation is (I am assuming that the reader already knows it); I will only explain how to use both of these methods to calculate the time complexity of recursive algorithms.

THE MASTER THEOREM

The master theorem concerns recurrence relations of the form:

T(n) = a·T(n/b) + f(n)

where

  • n is the size of the problem,
  • a is the number of subproblems in the recursion,
  • n/b is the size of each subproblem,
  • f(n) is the cost of the work done outside of the recursive calls (it includes the cost of dividing the problem and the cost of merging the solutions of the subproblems).

That is the mathematical definition; let us put it into a form we can use directly for calculating time complexity. Let

T(n) = a·T(n/b) + f(n)

where

a ≥ 1, b > 1

are constants. If the following applies

f(n) = Θ(n^c)

for some

c ≥ 0,

then:

T(n) = Θ(n^(log_b a))   if c < log_b a
T(n) = Θ(n^c · log n)   if c = log_b a
T(n) = Θ(n^c)           if c > log_b a
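Since all three conditions boil down to comparing c with log_b a, the decision is easy to mechanize. Here is a minimal Python sketch (the helper name master_theorem is mine, for illustration only):

```python
import math

def master_theorem(a, b, c):
    """Simplified master theorem for T(n) = a*T(n/b) + Θ(n^c).

    Returns the asymptotic bound as a string.
    """
    e = math.log(a, b)  # the critical exponent log_b(a)
    if math.isclose(c, e):
        return f"Θ(n^{c:g} log n)"  # condition two: c = log_b(a)
    if c < e:
        return f"Θ(n^{e:g})"        # condition one: c < log_b(a)
    return f"Θ(n^{c:g})"            # condition three: c > log_b(a)
```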

Now I have always loved these formal definitions, but let us give an example:

T(n) = 2·T(n/2) + n

The recurrence relation fits our definition above. The constants are

a = 2, b = 2

and

f(n) = n

Now we must ask ourselves the following:

c ? log_b a

Which operator (<, = or >) fits in the middle? Our function is f(n) = n, therefore we can say that our function is Θ(n^1). We check the exponent in Θ(n^1), so our c = 1. Now let us insert the values into the comparison above:

c = 1 = log_2 2 = log_b a

Our operator here is therefore equals and we fall into condition two of the equation above. Our time complexity is therefore:

T(n) = Θ(n·log n)
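With the sketch helper above, master_theorem(2, 2, 1) reports the same bound. And if you distrust the algebra, a quick numeric check is easy: unfold the recurrence directly and watch the ratio T(n)/(n·log n) settle. A minimal sketch, assuming the arbitrary base case T(1) = 1:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2T(n/2) + n with T(1) = 1 (the base-case constant is arbitrary)
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

for n in (2**10, 2**15, 2**20):
    print(n, T(n) / (n * math.log2(n)))  # the ratio tends towards 1
```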

These are the kind of examples you get everywhere. But the problem is, what do you do when you get an algorithm in front of you? Let us check the following algorithm:
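The algorithm has the following shape (a minimal Python sketch; the exact statements inside the loops are not important, only that the first loop does Θ(n) work and that the recursive call is repeated three times on subproblems of size n/3):

```python
def algorithm(n):
    if n <= 1:
        return
    for i in range(n):       # work done outside the recursion, executed n times
        pass                 # some constant-time operation
    for i in range(3):       # the recursive call is executed three times
        algorithm(n // 3)    # the size of each subproblem is n/3
```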

A simple recursive algorithm for the Master Method example.

We must write this algorithm as a recurrence relation of the form given above. How do we do that? Let us list the things we must find in the code above:

  • the recursive call and the number of times it is executed per invocation (the number of subproblems in the recursion),
  • the cost of the work that is done outside of the recursion,
  • the size of each subproblem.

From the code above we can clearly see where the recursion is done: the method algorithm is called inside the second for loop. That call is executed three times (the loop goes from 0 while it is smaller than 3) and the size of each subproblem is

n/3

(the argument when we call the method recursively). So we can say that

a = 3, b = 3

Before the recursive calls, another for loop is executed, and that is the work that is done outside the recursion. That loop is executed n times, so our work outside the recursion is n; our function therefore equals

f(n) = n

Now let us put it all together:

T(n) = 3·T(n/3) + n

Let us ask ourselves the same thing as with the relation above:

c = 1 = log_3 3 = log_b a

Our operator is again equals and we fall into the second condition. The time complexity is therefore:

T(n) = Θ(n·log n)
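With the sketch helper from earlier, master_theorem(3, 3, 1) reports the same result.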

The master method is simple, but it has its flaws. If an algorithm mixes recursive calls with substantially different subproblem sizes, we must resort to other methods for solving our time complexities.

THE AKRA-BAZZI METHOD

The Akra-Bazzi method is used where the subproblems of our algorithm have substantially different sizes. Let:

T(n) = g(n) + Σ_{i=1..k} a_i·T(b_i·n + h_i(n))   for n ≥ n_0

The conditions for usage are:

  • a_1, …, a_k are positive constants,
  • b_1, …, b_k are constants strictly between 0 and 1,
  • |g(n)| = O(n^c) for some constant c,
  • |h_i(n)| = O(n/(log n)^2) for each i,
  • T(n) is defined and bounded for each n below the threshold n_0.

If all those conditions are met, then we can say that:

T(n) ∈ Θ( n^p · (1 + ∫_1^n g(u)/u^(p+1) du) )

where p solves the equation

Σ_{i=1..k} a_i·b_i^p = 1
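The exponent p rarely needs to be guessed: because every b_i lies strictly between 0 and 1, the left-hand side of that equation is strictly decreasing in p, so a simple bisection finds the root numerically. A minimal sketch (solve_p is a made-up name, and the bracketing interval [-10, 10] is an assumption that covers typical cases):

```python
def solve_p(terms, lo=-10.0, hi=10.0, tol=1e-12):
    """Solve sum(a * b**p for a, b in terms) == 1 for p by bisection.

    terms is a list of (a_i, b_i) pairs with a_i > 0 and 0 < b_i < 1,
    which makes the sum strictly decreasing in p, so the root is unique.
    """
    f = lambda p: sum(a * b ** p for a, b in terms) - 1.0
    assert f(lo) > 0 > f(hi), "the root must be bracketed by [lo, hi]"
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if f(mid) > 0:
            lo = mid  # the sum is still above 1, the root lies to the right
        else:
            hi = mid
    return (lo + hi) / 2.0

# For instance, a_i = 1 and b_i in {1/2, 1/3, 1/6} gives p = 1:
print(solve_p([(1, 1/2), (1, 1/3), (1, 1/6)]))  # ≈ 1.0
```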

Again, while I am fond of definitions, it is the examples that teach us the most. Let us take two different examples: one where the recurrence relation is given, and one where we start from an algorithm.

For the first example, take the following recurrence relation:

T(n) = T(n/2) + T(n/3) + T(n/6) + n·log n   for n > 6

Here the constants are

a_1 = a_2 = a_3 = 1,  b_1 = 1/2,  b_2 = 1/3,  b_3 = 1/6,

and our function is

g(n) = n·log n

The constants satisfy our conditions. The function g is asymptotically bounded from above by a polynomial (for example, g(n) = O(n^2)). The functions h_i(n), which would absorb rounding terms such as floors and ceilings, are constant here and are therefore asymptotically bounded by O(n/(log n)^2). Our last condition states that our recurrence relation must be defined and finite for the first six cases, that is, the values T(1) through T(6) must be given.

Let us find the value of p that solves the equation

(1/2)^p + (1/3)^p + (1/6)^p = 1

The solution is p = 1, since 1/2 + 1/3 + 1/6 = 1. Now we can calculate the integral:

T(n) ∈ Θ( n·(1 + ∫_1^n (u·log u)/u^2 du) ) = Θ( n·(1 + ∫_1^n (log u)/u du) ) = Θ( n·(1 + (log^2 n)/2) ) = Θ(n·log^2 n)

Note that I have omitted some steps in calculating the value of the integral; it can be solved by using the substitution t = log u, which gives ∫_1^n (log u)/u du = (log^2 n)/2.

Now let's take on a practical example. Take a look at the following algorithm:
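A minimal Python sketch of such an algorithm (the concrete statements are stand-ins; what matters for the analysis is a loop that runs n^2 times followed by two recursive calls on subproblems of size n/4 and one on a subproblem of size n/2):

```python
def algorithm(n):
    if n <= 1:
        return
    i = 0
    while i < n * n:       # the loop runs while i is smaller than n squared,
        i += 1             # so the work done outside the recursion is n^2
    algorithm(n // 4)      # two recursive calls with the same subproblem size, n/4
    algorithm(n // 4)
    algorithm(n // 2)      # one recursive call with subproblem size n/2
```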

Simple example for the Akra Bazzi method.

The procedure we take is exactly the same as with the master theorem. We have three recursive calls in the function, but two of them have the same subproblem size, so we simply give that term the coefficient two. The work done outside the recursion equals n^2 (the loop condition is i < n^2, so the loop is executed n^2 times). Our recurrence relation is therefore:

T(n) = 2·T(n/4) + T(n/2) + n^2

Let us set up the equation:

2·(1/4)^p + (1/2)^p = 1

We only took the terms containing T from the upper equation: each subproblem ratio b_i is raised to the power p and multiplied by the number of times that recursive call is repeated. The solution to the equation is

p = 1

(substituting x = (1/2)^p turns it into 2x^2 + x − 1 = 0, whose positive root is x = 1/2). Now to solve it, we must insert p into the formula above:

T(n) ∈ Θ( n·(1 + ∫_1^n u^2/u^2 du) ) = Θ( n·(1 + (n − 1)) ) = Θ(n^2)
Note that this was an easy one. Equations can get a lot tougher, and knowing how to integrate functions is essential here, so make sure you pay attention in your computer science calculus class.

REFERENCES

  1. Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein. Introduction to Algorithms, Second Edition. MIT Press and McGraw-Hill, 2001. ISBN 0-262-03293-7. Sections 4.3 (The master method) and 4.4 (Proof of the master theorem), pp. 73–90.
  2. Mohamad Akra and Louay Bazzi. On the solution of linear recurrence equations. Computational Optimization and Applications, 10(2):195–210, 1998.
