[algorithm] How to find time complexity of an algorithm

The Question

How to find time complexity of an algorithm?

What have I done before posting a question on SO?

I have gone through this, this, and many other links.

But nowhere was I able to find a clear and straightforward explanation of how to calculate time complexity.

What do I know?

Say, for code as simple as the one below:

char h = 'y'; // This will be executed 1 time
int abc = 0; // This will be executed 1 time

Say, for a loop like the one below:

for (int i = 0; i < N; i++) {        
    Console.Write("Hello World!");
}

int i = 0; This will be executed only once. The time is actually attributed to the assignment i = 0, not to the declaration itself.

i < N; This will be executed N+1 times.

i++; This will be executed N times.

So the number of operations required by this loop is

{1+(N+1)+N} = 2N+2

Note: This may still be wrong, as I am not confident about my understanding of calculating time complexity.
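
One way to sanity-check that count (a sketch of my own; the ops counter is just an illustration, not part of the original loop) is to tally the operations at runtime:

    long ops = 0;
    int N = 10;
    ops++;                                  // int i = 0 executes once
    for (int i = 0; i < N; i++)
    {
        ops++;                              // i < N evaluates to true, N times
        Console.Write("Hello World!");      // loop body, N times
        ops++;                              // i++ executes N times
    }
    ops++;                                  // the final i < N evaluates to false, once
    Console.WriteLine(ops);                 // prints 2N + 2 = 22 for N = 10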

What do I want to know?

OK, so I think I know these small, basic calculations, but in most cases I have seen the time complexity given as

O(N), O(n^2), O(log n), O(n!), and many others.

Can anyone help me understand how one calculates the time complexity of an algorithm? I am sure there are plenty of newbies like me who want to know this.

This question is related to: algorithm, time-complexity, complexity-theory

The answer is


Taken from here - Introduction to Time Complexity of an Algorithm

1. Introduction

In computer science, the time complexity of an algorithm quantifies the amount of time taken by an algorithm to run as a function of the length of the string representing the input.

2. Big O notation

The time complexity of an algorithm is commonly expressed using big O notation, which excludes coefficients and lower order terms. When expressed this way, the time complexity is said to be described asymptotically, i.e., as the input size goes to infinity.

For example, if the time required by an algorithm on all inputs of size n is at most 5n^3 + 3n, the asymptotic time complexity is O(n^3). More on that later.

A few more examples:

  • 1 = O(n)
  • n = O(n^2)
  • log(n) = O(n)
  • 2n + 1 = O(n)
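
Formally (a standard definition, added here for reference; the original answer leaves it implicit), in LaTeX notation:

    f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 > 0 \ \text{such that} \ f(n) \le c \cdot g(n) \ \text{for all} \ n \ge n_0

For instance, 5n^3 + 3n \le 6n^3 for all n \ge 2, so 5n^3 + 3n = O(n^3).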

3. O(1) Constant Time:

An algorithm is said to run in constant time if it requires the same amount of time regardless of the input size.

Examples:

  • array: accessing any element
  • fixed-size stack: push and pop methods
  • fixed-size queue: enqueue and dequeue methods
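
As a tiny illustration (my own example, not from the original answer), array access touches one memory location no matter how large the array is:

    int[] a = { 5, 8, 13, 21 };
    int x = a[2]; // one address computation, independent of a.Length: O(1)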

4. O(n) Linear Time

An algorithm is said to run in linear time if its execution time is directly proportional to the input size, i.e. time grows linearly as the input size increases.

Consider the following example: below, I am searching linearly for an element, and this has a time complexity of O(n).

int find = 66;
var numbers = new int[] { 33, 435, 36, 37, 43, 45, 66, 656, 2232 };
for (int i = 0; i < numbers.Length; i++)
{
    if(find == numbers[i])
    {
        return;
    }
}

More Examples:

  • Array: linear search, traversal, find minimum, etc.
  • ArrayList: contains method
  • Queue: contains method

5. O(log n) Logarithmic Time:

An algorithm is said to run in logarithmic time if its execution time is proportional to the logarithm of the input size.

Example: Binary Search

Recall the "twenty questions" game - the task is to guess the value of a hidden number in an interval. Each time you make a guess, you are told whether your guess is too high or too low. The game implies a strategy that uses your guesses to halve the interval size. This is an example of the general problem-solving method known as binary search.
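
To make that concrete, here is a minimal iterative binary search sketch (my own code; the original answer describes the idea only in prose). Each iteration halves the remaining interval, so at most about log_2 n iterations run:

    static int BinarySearch(int[] sorted, int target)
    {
        int low = 0, high = sorted.Length - 1;
        while (low <= high)
        {
            int mid = low + (high - low) / 2;         // written this way to avoid int overflow
            if (sorted[mid] == target) return mid;
            if (sorted[mid] < target) low = mid + 1;  // discard the lower half
            else high = mid - 1;                      // discard the upper half
        }
        return -1; // target is not present
    }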

6. O(n^2) Quadratic Time

An algorithm is said to run in quadratic time if its execution time is proportional to the square of the input size.

Examples: nested loops over the same input, such as Bubble Sort, Selection Sort, and Insertion Sort (see the sketch below).
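
A minimal Bubble Sort sketch (my own example): the two nested passes perform roughly n*(n-1)/2 comparisons, which is O(n^2):

    static void BubbleSort(int[] a)
    {
        for (int i = 0; i < a.Length - 1; i++)           // after pass i, the largest i+1 elements are in place
            for (int j = 0; j < a.Length - 1 - i; j++)
                if (a[j] > a[j + 1])
                    (a[j], a[j + 1]) = (a[j + 1], a[j]); // swap adjacent out-of-order elements
    }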



When you're analyzing code, you have to analyze it line by line, counting every operation / recognizing time complexity; in the end, you have to sum all of it to get the whole picture.

For example, you can have one simple loop with linear complexity, but later in that same program you can have a triple loop that has cubic complexity, so your program will have cubic complexity overall. The function's order of growth comes into play right here.

Let's look at the possibilities for the time complexity of an algorithm; below you can see the orders of growth I mentioned above:

  • Constant time has an order of growth 1, for example: a = b + c.

  • Logarithmic time has an order of growth LogN; it usually occurs when you're dividing something in half (binary search, trees, even loops), or multiplying something in the same way.

  • Linear, order of growth is N, for example

    int p = 0;
    for (int i = 1; i < N; i++)
      p = p + 2;
    
  • Linearithmic, order of growth is N*LogN, usually occurs in divide-and-conquer algorithms.

  • Cubic, order of growth N^3, a classic example is a triple loop where you check all triplets:

    int x = 0;
    for (int i = 0; i < N; i++)
       for (int j = 0; j < N; j++)
          for (int k = 0; k < N; k++)
              x = x + 2;
    
  • Exponential, order of growth 2^N, usually occurs when you do an exhaustive search, for example checking all subsets of some set (see the sketch just below).
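
Here is what that exhaustive subset check can look like (my own example; bitmask enumeration is one common way to write it). There are 2^N subsets of an N-element set, so visiting them all takes O(2^N) time:

    static void PrintSubsets(int[] set)
    {
        int n = set.Length;
        for (int mask = 0; mask < (1 << n); mask++)   // 2^n bitmasks, one per subset
        {
            for (int i = 0; i < n; i++)
                if ((mask & (1 << i)) != 0)           // bit i set => element i is included
                    Console.Write(set[i] + " ");
            Console.WriteLine();
        }
    }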


I know this question goes way back and there are some excellent answers here; nonetheless, I wanted to share another bit for the mathematically-minded people who will stumble upon this post. The Master theorem is another useful thing to know when studying complexity; I didn't see it mentioned in the other answers.
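
For reference, here is the standard statement (my addition, in LaTeX; the original answer only names the theorem). It applies to recurrences of the form

    T(n) = a\,T(n/b) + f(n), \qquad a \ge 1,\ b > 1.

Writing d = \log_b a, the theorem says:

  • If f(n) = O(n^{d - \epsilon}) for some \epsilon > 0, then T(n) = \Theta(n^d).
  • If f(n) = \Theta(n^d), then T(n) = \Theta(n^d \log n).
  • If f(n) = \Omega(n^{d + \epsilon}) for some \epsilon > 0 (plus a regularity condition on f), then T(n) = \Theta(f(n)).

For example, Merge Sort's recurrence T(n) = 2T(n/2) + \Theta(n) has a = b = 2, so d = 1 and the second case gives T(n) = \Theta(n \log n).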


Although there are some good answers for this question, I would like to give another answer here with several examples of loops.

  • O(n): The time complexity of a loop is considered O(n) if the loop variable is incremented/decremented by a constant amount. For example, the following loops have O(n) time complexity.

    // Here c is a positive integer constant   
    for (int i = 1; i <= n; i += c) {  
        // some O(1) expressions
    }
    
    for (int i = n; i > 0; i -= c) {
        // some O(1) expressions
    }
    
  • O(n^c): The time complexity of nested loops is equal to the number of times the innermost statement is executed. For example, the following sample loops have O(n^2) time complexity:

    for (int i = 1; i <=n; i += c) {
       for (int j = 1; j <=n; j += c) {
          // some O(1) expressions
       }
    }
    
    for (int i = n; i > 0; i -= c) {
       for (int j = i+1; j <= n; j += c) {
          // some O(1) expressions
       }
    }
    

    For example, Selection Sort and Insertion Sort have O(n^2) time complexity.

  • O(Logn): The time complexity of a loop is considered O(Logn) if the loop variable is divided/multiplied by a constant amount (here c is a constant greater than 1).

    for (int i = 1; i <=n; i *= c) {
       // some O(1) expressions
    }
    for (int i = n; i > 0; i /= c) {
       // some O(1) expressions
    }
    

    For example, Binary Search has O(Logn) time complexity.

  • O(LogLogn): The time complexity of a loop is considered O(LogLogn) if the loop variable is decreased/increased exponentially by a constant power.

    // Here c is a constant greater than 1   
    for (int i = 2; i <=n; i = pow(i, c)) { 
       // some O(1) expressions
    }
    //Here fun is sqrt or cuberoot or any other constant root
    for (int i = n; i > 0; i = fun(i)) { 
       // some O(1) expressions
    }
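
    To see where the LogLogn comes from (my derivation, for completeness): take the first loop with c = 2, so that i is squared on every iteration. Starting from i = 2, after k iterations i = 2^{2^k}; the loop stops once 2^{2^k} > n, i.e. after roughly \log_2 \log_2 n iterations.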
    

One example of time complexity analysis

void fun(int n)
{    
    for (int i = 1; i <= n; i++)
    {
        for (int j = 1; j < n; j += i)
        {
            // Some O(1) task
        }
    }    
}

Analysis:

For i = 1, the inner loop is executed n times. For i = 2, it is executed approximately n/2 times. For i = 3, approximately n/3 times. For i = 4, approximately n/4 times. … For i = n, approximately n/n times.

So the total time complexity of the above algorithm is (n + n/2 + n/3 + … + n/n), which becomes n * (1/1 + 1/2 + 1/3 + … + 1/n).

The important thing about the series (1/1 + 1/2 + 1/3 + … + 1/n) is that it is O(Logn). So the time complexity of the above code is O(nLogn).
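
To justify that last step (a standard bound, my addition): the n-th harmonic number satisfies

    H_n = \sum_{k=1}^{n} \frac{1}{k}, \qquad \ln(n) \le H_n \le \ln(n) + 1,

so H_n = \Theta(\log n), and therefore n \cdot H_n = \Theta(n \log n).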




O(n) is the big O notation used for writing the time complexity of an algorithm. When you add up the number of executions in an algorithm, you'll get an expression such as 2N+2; in this expression, N is the dominating term (the term having the largest effect on the expression if its value increases or decreases). Now O(N) is the time complexity, where N is the dominating term. Example:

For i = 1 to n
  j = 0;
  while (j <= n)
    j = j + 1;

Here the inner while loop executes n+1 times on each of the n iterations of the outer loop, so the total number of executions for the whole algorithm is roughly n(n+1) = n^2 + n. Here n^2 is the dominating term, so the time complexity of this algorithm is O(n^2).


Loosely speaking, time complexity is a way of summarising how the number of operations or run-time of an algorithm grows as the input size increases.

Like most things in life, a cocktail party can help us understand.

O(N)

When you arrive at the party, you have to shake everyone's hand (do an operation on every item). As the number of attendees N increases, the time/work it will take you to shake everyone's hand increases as O(N).

Why O(N) and not cN?

There's variation in the amount of time it takes to shake hands with people. You could average this out and capture it in a constant c. But the fundamental operation here --- shaking hands with everyone --- would always be O(N), no matter what c was. When debating whether we should go to a cocktail party, we're often more interested in the fact that we'll have to meet everyone than in the minute details of what those meetings look like.

O(N^2)

The host of the cocktail party wants you to play a silly game where everyone meets everyone else. Therefore, you must meet N-1 other people and, because the next person has already met you, they must meet N-2 people, and so on. The sum of this series is N^2/2 - N/2. As the number of attendees grows, the N^2 term gets big fast, so we just drop everything else.

O(N^3)

You have to meet everyone else and, during each meeting, you must talk about everyone else in the room.

O(1)

The host wants to announce something. They ding a wineglass and speak loudly. Everyone hears them. It turns out it doesn't matter how many attendees there are, this operation always takes the same amount of time.

O(log N)

The host has laid everyone out at the table in alphabetical order. Where is Dan? You reason that he must be somewhere between Adam and Mandy (certainly not between Mandy and Zach!). Given that, is he between George and Mandy? No. He must be between Adam and Fred, and between Cindy and Fred. And so on... we can efficiently locate Dan by looking at half the set and then half of that set. Ultimately, we look at O(log_2 N) individuals.

O(N log N)

You could find where to sit down at the table using the algorithm above. If a large number of people came to the table, one at a time, and all did this, that would take O(N log N) time. This turns out to be how long it takes to sort any collection of items when they must be compared.

Best/Worst Case

You arrive at the party and need to find Inigo - how long will it take? It depends on when you arrive. If everyone is milling around you've hit the worst-case: it will take O(N) time. However, if everyone is sitting down at the table, it will take only O(log N) time. Or maybe you can leverage the host's wineglass-shouting power and it will take only O(1) time.

Assuming the host is unavailable, we can say that the Inigo-finding algorithm has a lower-bound of O(log N) and an upper-bound of O(N), depending on the state of the party when you arrive.

Space & Communication

The same ideas can be applied to understanding how algorithms use space or communication.

Knuth has written a nice paper about the former entitled "The Complexity of Songs".

Theorem 2: There exist arbitrarily long songs of complexity O(1).

PROOF: (due to Casey and the Sunshine Band). Consider the songs Sk defined by (15), but with

V_k = 'That's the way,' U 'I like it, ' U
U   = 'uh huh,' 'uh huh'

for all k.


This is an excellent article : http://www.daniweb.com/software-development/computer-science/threads/13488/time-complexity-of-algorithm

The below answer is copied from above (in case the excellent link goes bust)

The most common metric for calculating time complexity is Big O notation. This removes all constant factors so that the running time can be estimated in relation to N as N approaches infinity. In general you can think of it like this:

statement;

Is constant. The running time of the statement will not change in relation to N.

for ( i = 0; i < N; i++ )
     statement;

Is linear. The running time of the loop is directly proportional to N. When N doubles, so does the running time.

for ( i = 0; i < N; i++ ) {
  for ( j = 0; j < N; j++ )
    statement;
}

Is quadratic. The running time of the two loops is proportional to the square of N. When N doubles, the running time increases fourfold.

while ( low <= high ) {
  mid = ( low + high ) / 2;
  if ( target < list[mid] )
    high = mid - 1;
  else if ( target > list[mid] )
    low = mid + 1;
  else break;
}

Is logarithmic. The running time of the algorithm is proportional to the number of times N can be divided by 2. This is because the algorithm divides the working area in half with each iteration.

void quicksort ( int list[], int left, int right )
{
  if ( left >= right )  /* base case: sub-array of zero or one element */
    return;
  int pivot = partition ( list, left, right );
  quicksort ( list, left, pivot - 1 );
  quicksort ( list, pivot + 1, right );
}

Is N * log ( N ). The running time consists of log ( N ) levels of partitioning (iterative or recursive), each doing a linear amount of work, thus the algorithm is a combination of linear and logarithmic.

In general, doing something with every item in one dimension is linear, doing something with every item in two dimensions is quadratic, and dividing the working area in half is logarithmic. There are other Big O measures such as cubic, exponential, and square root, but they're not nearly as common. Big O notation is described as O ( <type> ) where <type> is the measure. The quicksort algorithm would be described as O ( N * log ( N ) ).

Note that none of this has taken into account best, average, and worst case measures. Each would have its own Big O notation. Also note that this is a VERY simplistic explanation. Big O is the most common, but it's also more complex than I've shown. There are also other notations such as big omega, little o, and big theta. You probably won't encounter them outside of an algorithm analysis course. ;)
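
For completeness (standard definitions, my addition; the original article stops at the names), in LaTeX notation:

    f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : f(n) \ge c \cdot g(n) \ \text{for all} \ n \ge n_0
    f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and} \ f(n) = \Omega(g(n))
    f(n) = o(g(n)) \iff \lim_{n \to \infty} f(n)/g(n) = 0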


Time complexity with examples

1 - Basic operations (arithmetic, comparisons, accessing array elements, assignment): The running time is always constant, O(1).

Example :

read(x)                               // O(1)
a = 10;                               // O(1)
a = 1.000.000.000.000.000.000         // O(1)

2 - If-then-else statement: Take only the maximum running time from the two or more possible branches.

Example:

age = read(x)                               // (1+1) = 2
if age < 17 then begin                      // 1
      status = "Not allowed!";              // 1
end else begin
      status = "Welcome! Please come in";   // 1
      visitors = visitors + 1;              // 1+1 = 2
end;

So, the complexity of the above pseudo code is T(n) = 2 + 1 + max(1, 1+2) = 6. Thus, its big O is still constant: T(n) = O(1).

3 - Looping (for, while, repeat): The running time for this statement is the number of iterations multiplied by the number of operations inside the loop.

Example:

total = 0;                                  // 1
for i = 1 to n do begin                     // (1+1)*n = 2n
      total = total + i;                    // (1+1)*n = 2n
end;
writeln(total);                             // 1

So, its complexity is T(n) = 1+4n+1 = 4n + 2. Thus, T(n) = O(n).

4 - Nested loop (looping inside looping): Since there is at least one loop inside the main loop, the running time of this statement is O(n^2) or O(n^3).

Example:

for i = 1 to n do begin                     // (1+1)*n  = 2n
   for j = 1 to n do begin                  // (1+1)n*n = 2n^2
       x = x + 1;                           // (1+1)n*n = 2n^2
       print(x);                            // (n*n)    = n^2
   end;
end;
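
Summing those per-line counts (this concluding sum is my addition; the original stops at the counts): T(n) = 2n + 2n^2 + 2n^2 + n^2 = 5n^2 + 2n, and thus T(n) = O(n^2).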

Common Running Time

There are some common running times when analyzing an algorithm:

  1. O(1) – Constant Time: Constant time means the running time is constant; it is not affected by the input size.

  2. O(n) – Linear Time: When an algorithm accepts an input of size n, it performs on the order of n operations as well.

  3. O(log n) – Logarithmic Time: An algorithm with running time O(log n) is slightly faster than O(n). Commonly, such an algorithm divides the problem into subproblems of the same size. Examples: binary search algorithm, binary conversion algorithm.

  4. O(n log n) – Linearithmic Time: This running time is often found in "divide & conquer" algorithms, which divide the problem into subproblems recursively and then merge them in n time. Example: Merge Sort algorithm.

  5. O(n^2) – Quadratic Time: Look at the Bubble Sort algorithm!

  6. O(n^3) – Cubic Time: It follows the same principle as O(n^2).

  7. O(2^n) – Exponential Time: Very slow as the input gets larger; if n = 1,000,000, T(n) would be 2^1,000,000. The Brute Force algorithm has this running time.

  8. O(n!) – Factorial Time: THE SLOWEST!!! Example: the Travelling Salesman Problem (TSP).
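
For perspective (my own arithmetic, not from the article): at n = 20, n^2 = 400, 2^n = 1,048,576, and n! ≈ 2.4 × 10^18, which is why exponential- and factorial-time algorithms are hopeless beyond tiny inputs.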

Taken from this article. It is very well explained and worth a read.

