[data-structures] What is the time complexity of indexing, inserting and removing from common data structures?

There doesn't seem to be a single summary of the big-O complexities for indexing, inserting into and removing from the most common data structures, such as arrays, linked lists and hash tables.

This question is related to data-structures complexity-theory big-o

The answer is


Amortized Big-O for hashtables:

  • Insert - O(1)
  • Retrieve - O(1)
  • Delete - O(1)

Note that there is a constant factor for the hashing algorithm, and the amortization means that actual measured performance may vary dramatically.
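
To make this concrete, here is a minimal sketch of the three operations (my own illustration, assuming java.util.HashMap as the hash table; it is not part of the original answer):

    import java.util.HashMap;
    import java.util.Map;

    public class HashTableDemo {
        public static void main(String[] args) {
            Map<String, Integer> table = new HashMap<>();

            table.put("apple", 3);               // insert: amortized O(1)
            Integer count = table.get("apple");  // retrieve: expected O(1)
            table.remove("apple");               // delete: expected O(1)

            System.out.println(count);           // prints 3
        }
    }

The amortization shows up in put: an individual insertion that triggers a resize of the backing array costs O(n), but averaged over many insertions the cost per operation stays constant.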


Nothing is as useful as this chart of Common Data Structure Operations:

[Image: Common Data Structure Operations complexity chart]


Red-Black trees:

  • Insert - O(log n)
  • Retrieve - O(log n)
  • Delete - O(log n)
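
For illustration (a sketch I am adding, not part of the original answer): java.util.TreeMap is a red-black tree based map in the standard JDK, so its basic operations carry exactly these bounds:

    import java.util.TreeMap;

    public class RedBlackTreeDemo {
        public static void main(String[] args) {
            // TreeMap is a red-black tree based map in the standard JDK.
            TreeMap<Integer, String> tree = new TreeMap<>();

            tree.put(42, "answer");       // insert: O(log n)
            String value = tree.get(42);  // retrieve: O(log n)
            tree.remove(42);              // delete: O(log n)

            System.out.println(value);    // prints "answer"
        }
    }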

I guess I will start you off with the time complexity of a linked list:

  • Indexing - O(n)
  • Inserting / Deleting at end - O(1) or O(n)
  • Inserting / Deleting in middle - O(1) with an iterator, O(n) without

The time complexity of inserting at the end depends on whether you have a reference to the last node. If you do, the insertion is O(1); otherwise you first have to traverse the list to reach the end, and the cost jumps to O(n).
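
A minimal singly linked list sketch (my own illustration, not from the original answer) makes the difference concrete: keeping a tail reference gives O(1) append, while indexing still has to walk the chain from the head:

    public class SinglyLinkedList<T> {
        private static class Node<T> {
            T value;
            Node<T> next;
            Node(T value) { this.value = value; }
        }

        private Node<T> head;
        private Node<T> tail;  // keeping a tail reference makes append O(1)
        private int size;

        // O(1): we already hold a reference to the last node.
        public void append(T value) {
            Node<T> node = new Node<>(value);
            if (tail == null) {
                head = tail = node;
            } else {
                tail.next = node;
                tail = node;
            }
            size++;
        }

        // O(n): we must walk node by node to reach index i.
        public T get(int i) {
            if (i < 0 || i >= size) throw new IndexOutOfBoundsException();
            Node<T> current = head;
            for (int k = 0; k < i; k++) current = current.next;
            return current.value;
        }
    }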


Keep in mind that unless you're writing your own data structure (e.g. a linked list in C), performance can depend dramatically on how the data structures in your language/framework of choice are implemented. As an example, take a look at the benchmarks of Apple's CFArray over at Ridiculous Fish. In this case, CFArray from Apple's CoreFoundation framework actually switches its underlying data structure depending on how many objects are in the array, changing from linear time to constant time at around 30,000 objects.

This is actually one of the beautiful things about object-oriented programming - you don't need to know how it works, just that it works, and the 'how it works' can change depending on requirements.


Information on this topic is now available on Wikipedia at: Search data structure

+----------------------+----------+------------+----------+--------------+
|                      |  Insert  |   Delete   |  Search  | Space Usage  |
+----------------------+----------+------------+----------+--------------+
| Unsorted array       | O(1)     | O(1)       | O(n)     | O(n)         |
| Value-indexed array  | O(1)     | O(1)       | O(1)     | O(n)         |
| Sorted array         | O(n)     | O(n)       | O(log n) | O(n)         |
| Unsorted linked list | O(1)*    | O(1)*      | O(n)     | O(n)         |
| Sorted linked list   | O(n)*    | O(1)*      | O(n)     | O(n)         |
| Balanced binary tree | O(log n) | O(log n)   | O(log n) | O(n)         |
| Heap                 | O(log n) | O(log n)** | O(n)     | O(n)         |
| Hash table           | O(1)     | O(1)       | O(1)     | O(n)         |
+----------------------+----------+------------+----------+--------------+

 * The cost of adding or deleting an element at a known location in the
   list (i.e. if you have an iterator to the location) is O(1). If you
   don't know the location, you need to traverse the list to the point
   of insertion/deletion, which takes O(n) time.

** The deletion cost is O(log n) for the minimum or maximum, O(n) for an
   arbitrary element.
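
As an illustration of the heap row (my own hedged sketch, assuming java.util.PriorityQueue, which is a binary min-heap): inserting and removing the minimum are O(log n), but removing an arbitrary element has to find it first, which is O(n).

    import java.util.PriorityQueue;

    public class HeapDemo {
        public static void main(String[] args) {
            PriorityQueue<Integer> heap = new PriorityQueue<>();

            heap.add(5);  // insert: O(log n)
            heap.add(1);
            heap.add(3);

            int min = heap.poll();             // remove the minimum: O(log n)
            boolean removed = heap.remove(5);  // arbitrary element: O(n) search plus O(log n) fix-up

            System.out.println(min + " " + removed);  // prints "1 true"
        }
    }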
