Time & Space Complexity [Cheat Sheet]

In computer science, understanding the efficiency of algorithms is critical for developing effective and optimized applications. Two key metrics used to evaluate algorithmic efficiency are time complexity and space complexity. These metrics describe how an algorithm’s resource usage scales with input size, helping programmers write optimal, efficient code.

Let’s understand both of these metrics.

Time Complexity

Time Complexity, also known as the running time of an algorithm, is the number of primitive operations executed on a particular input. It measures how long an algorithm takes to complete as a function of the input size and provides an upper bound on the time required to run, which is crucial for performance considerations.

Common Time Complexities

  • Constant Time, O(1): The execution time is constant and does not change with the input size.
  • Logarithmic Time, O(log n): The execution time grows logarithmically with the input size.
  • Linear Time, O(n): The execution time grows linearly with the input size.
  • Quadratic Time, O(n²): The execution time grows quadratically with the input size, making it significantly slower as the input size increases.
  • Exponential Time, O(2ⁿ): The execution time grows exponentially with the input size, which is often impractical for large inputs. Each class is illustrated in the sketch below.
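
To make these classes concrete, here is a minimal Python sketch of each one. The function names and sample tasks are illustrative, not drawn from any particular library.

```python
def first_element(items):
    """O(1): one operation, regardless of input size."""
    return items[0]

def binary_search(items, target):
    """O(log n): halves the (sorted) search range on every step."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def total(items):
    """O(n): touches every element exactly once."""
    result = 0
    for x in items:
        result += x
    return result

def has_duplicate(items):
    """O(n²): compares every pair of elements."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def subsets(items):
    """O(2ⁿ): each element is either included or excluded."""
    if not items:
        return [[]]
    rest = subsets(items[1:])
    return rest + [[items[0]] + s for s in rest]
```
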
Space Complexity

Space Complexity is the total (extra) memory space an algorithm requires during execution, expressed as a function of the input size. It helps in understanding an algorithm’s memory requirements and in managing resources efficiently.

Common Space Complexities

  • Constant Space, O(1): The algorithm uses a fixed amount of space regardless of the input size.
  • Linear Space, O(n): The memory usage grows linearly with the input size.
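
A short Python sketch contrasting the two (the helper names are illustrative; `running_max` assumes a non-empty list):

```python
def running_max(items):
    """O(1) extra space: one accumulator, no matter how long the input is."""
    best = items[0]
    for x in items[1:]:
        if x > best:
            best = x
    return best

def squares(items):
    """O(n) extra space: the output list grows with the input."""
    result = []
    for x in items:
        result.append(x * x)
    return result
```
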
Big-O Complexity Chart

Sorting Algorithms

| Algorithm      | Time: Best  | Time: Average    | Time: Worst    | Space: Worst |
|----------------|-------------|------------------|----------------|--------------|
| Quicksort      | Ω(n log(n)) | Θ(n log(n))      | O(n²)          | O(log(n))    |
| Mergesort      | Ω(n log(n)) | Θ(n log(n))      | O(n log(n))    | O(n)         |
| Timsort        | Ω(n)        | Θ(n log(n))      | O(n log(n))    | O(n)         |
| Heapsort       | Ω(n log(n)) | Θ(n log(n))      | O(n log(n))    | O(1)         |
| Bubble Sort    | Ω(n)        | Θ(n²)            | O(n²)          | O(1)         |
| Insertion Sort | Ω(n)        | Θ(n²)            | O(n²)          | O(1)         |
| Selection Sort | Ω(n²)       | Θ(n²)            | O(n²)          | O(1)         |
| Tree Sort      | Ω(n log(n)) | Θ(n log(n))      | O(n²)          | O(n)         |
| Shell Sort     | Ω(n log(n)) | Θ(n (log(n))²)   | O(n (log(n))²) | O(1)         |
| Bucket Sort    | Ω(n+k)      | Θ(n+k)           | O(n²)          | O(n)         |
| Radix Sort     | Ω(nk)       | Θ(nk)            | O(nk)          | O(n+k)       |
| Counting Sort  | Ω(n+k)      | Θ(n+k)           | O(n+k)         | O(k)         |
| Cubesort       | Ω(n)        | Θ(n log(n))      | O(n log(n))    | O(n)         |
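
As a concrete instance of one row above, here is a minimal merge sort sketch in Python: a standard textbook formulation with Θ(n log(n)) running time and O(n) auxiliary space for the merge buffers.

```python
def merge_sort(items):
    """Textbook merge sort: Θ(n log n) time, O(n) auxiliary space."""
    if len(items) <= 1:
        return items                      # base case: already sorted
    mid = len(items) // 2
    left = merge_sort(items[:mid])        # sort each half recursively
    right = merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge in linear time
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])               # append whichever half remains
    merged.extend(right[j:])
    return merged
```
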

Data Structure Operations

| Data Structure     | Access (Avg) | Search (Avg) | Insertion (Avg) | Deletion (Avg) | Access (Worst) | Search (Worst) | Insertion (Worst) | Deletion (Worst) | Space (Worst) |
|--------------------|--------------|--------------|-----------------|----------------|----------------|----------------|-------------------|------------------|---------------|
| Array              | Θ(1)         | Θ(n)         | Θ(n)            | Θ(n)           | O(1)           | O(n)           | O(n)              | O(n)             | O(n)          |
| Stack              | Θ(n)         | Θ(n)         | Θ(1)            | Θ(1)           | O(n)           | O(n)           | O(1)              | O(1)             | O(n)          |
| Queue              | Θ(n)         | Θ(n)         | Θ(1)            | Θ(1)           | O(n)           | O(n)           | O(1)              | O(1)             | O(n)          |
| Singly-Linked List | Θ(n)         | Θ(n)         | Θ(1)            | Θ(1)           | O(n)           | O(n)           | O(1)              | O(1)             | O(n)          |
| Doubly-Linked List | Θ(n)         | Θ(n)         | Θ(1)            | Θ(1)           | O(n)           | O(n)           | O(1)              | O(1)             | O(n)          |
| Skip List          | Θ(log(n))    | Θ(log(n))    | Θ(log(n))       | Θ(log(n))      | O(n)           | O(n)           | O(n)              | O(n)             | O(n log(n))   |
| Hash Table         | N/A          | Θ(1)         | Θ(1)            | Θ(1)           | N/A            | O(n)           | O(n)              | O(n)             | O(n)          |
| Binary Search Tree | Θ(log(n))    | Θ(log(n))    | Θ(log(n))       | Θ(log(n))      | O(n)           | O(n)           | O(n)              | O(n)             | O(n)          |
| Cartesian Tree     | N/A          | Θ(log(n))    | Θ(log(n))       | Θ(log(n))      | N/A            | O(n)           | O(n)              | O(n)             | O(n)          |
| B-Tree             | Θ(log(n))    | Θ(log(n))    | Θ(log(n))       | Θ(log(n))      | O(log(n))      | O(log(n))      | O(log(n))         | O(log(n))        | O(n)          |
| Red-Black Tree     | Θ(log(n))    | Θ(log(n))    | Θ(log(n))       | Θ(log(n))      | O(log(n))      | O(log(n))      | O(log(n))         | O(log(n))        | O(n)          |
| Splay Tree         | N/A          | Θ(log(n))    | Θ(log(n))       | Θ(log(n))      | N/A            | O(log(n))      | O(log(n))         | O(log(n))        | O(n)          |
| AVL Tree           | Θ(log(n))    | Θ(log(n))    | Θ(log(n))       | Θ(log(n))      | O(log(n))      | O(log(n))      | O(log(n))         | O(log(n))        | O(n)          |
| KD Tree            | Θ(log(n))    | Θ(log(n))    | Θ(log(n))       | Θ(log(n))      | O(n)           | O(n)           | O(n)              | O(n)             | O(n)          |
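
The binary search tree row, for example, reflects the fact that each comparison discards an entire subtree, so cost is proportional to tree height: Θ(log(n)) on average but O(n) when the tree degenerates into a chain. A minimal unbalanced BST sketch in Python (illustrative, assuming comparable keys):

```python
class Node:
    """A node of an unbalanced binary search tree."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    """Average Θ(log n); degrades to O(n) if the tree becomes a chain."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

def search(root, key):
    """Follows a single root-to-leaf path, so cost tracks tree height."""
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root
```

Self-balancing variants such as AVL and red-black trees keep the height at O(log(n)), which is why their worst-case rows stay logarithmic.
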

Graph Algorithms

| Algorithm                                       | Time: Average    | Time: Worst      | Space: Worst |
|-------------------------------------------------|------------------|------------------|--------------|
| DFS (Depth-First Search)                        | O(V + E)         | O(V + E)         | O(V)         |
| BFS (Breadth-First Search)                      | O(V + E)         | O(V + E)         | O(V)         |
| Dijkstra’s (shortest path)                      | O((V + E) log(V))| O((V + E) log(V))| O(V)         |
| Bellman-Ford (shortest path)                    | O(V · E)         | O(V · E)         | O(V)         |
| Kruskal’s minimum spanning tree                 | O(E log(V))      | O(E log(V))      | O(E + V)     |
| Prim’s minimum spanning tree (adjacency matrix) | O(V²)            | O(V²)            | O(1)         |
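
To see where O(V + E) comes from, here is a minimal BFS sketch in Python, assuming the graph is given as an adjacency-list dictionary (a hypothetical input format chosen for illustration): every vertex is enqueued at most once and every edge is examined once.

```python
from collections import deque

def bfs(graph, start):
    """Breadth-first search: O(V + E) time, O(V) space for the queue and set."""
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        v = queue.popleft()
        order.append(v)                      # each vertex dequeued once: O(V)
        for neighbor in graph.get(v, []):    # each edge examined once: O(E)
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return order

# Example with a hypothetical adjacency-list graph:
# bfs({"a": ["b", "c"], "b": ["d"], "c": [], "d": []}, "a")  ->  ['a', 'b', 'c', 'd']
```
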

Understanding and applying the concepts of time and space complexity is fundamental to writing efficient algorithms. Evaluating these metrics helps you write code that scales well, ensuring performance and robustness.

Reference: Big-O Cheatsheet
