In computer science, understanding the efficiency of algorithms is critical for developing effective, optimized applications. Two key metrics used to evaluate algorithmic efficiency are time complexity and space complexity. They describe how an algorithm's resource usage scales with input size, which helps programmers compare approaches and write code that stays efficient as inputs grow.
Let's take a closer look at each of these metrics.
Time Complexity
Time complexity, also known as the running time of an algorithm, is the number of primitive operations executed on a particular input. It measures the amount of time an algorithm takes to complete as a function of the length of the input, and it provides an upper bound on the time required for the algorithm to run, which is crucial for performance considerations.
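For example, a loop that sums a list performs a constant number of primitive operations per element, so its running time grows in direct proportion to the input length. A minimal Python sketch (the function name is illustrative):

```python
def sum_list(values):
    """Linear time, O(n): one pass with constant work per element."""
    total = 0            # 1 operation
    for v in values:     # loop body runs n times
        total += v       # 1 addition per iteration
    return total         # total work grows linearly with n
```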
Common Time Complexities
- Constant Time, O(1): The execution time is constant and does not change with the input size.
- Logarithmic Time, O(log n): The execution time grows logarithmically with the input size.
- Linear Time, O(n): The execution time grows linearly with the input size.
- Quadratic Time, O(n²): The execution time grows quadratically with the input size, making it significantly slower as the input size increases.
- Exponential Time, O(2ⁿ): The execution time grows exponentially with the input size, which is often impractical for large inputs.
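These growth rates are easiest to see in code. The sketches below pair each class with a representative Python function (illustrative examples written for this article, not taken from any library):

```python
def get_first(items):
    """Constant time, O(1): one operation regardless of input size."""
    return items[0]

def binary_search(sorted_items, target):
    """Logarithmic time, O(log n): the search range halves each step."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def has_duplicate(items):
    """Quadratic time, O(n²): compares every pair of elements."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def subsets(items):
    """Exponential time, O(2ⁿ): generates every subset of the input."""
    if not items:
        return [[]]
    rest = subsets(items[1:])
    return rest + [[items[0]] + s for s in rest]
```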
Space Complexity
Space complexity is the total (extra) memory space an algorithm requires for its execution, expressed as a function of the size of the input. It helps in understanding an algorithm's memory requirements and in managing resources efficiently.
Common Space Complexities
- Constant Space, O(1): The algorithm uses a fixed amount of space regardless of the input size.
- Linear Space, O(n): The memory usage grows linearly with the input size.
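The difference is easy to see by comparing two ways of reversing a list; a minimal sketch (function names are illustrative):

```python
def reverse_in_place(items):
    """Constant space, O(1): only two index variables, regardless of n."""
    lo, hi = 0, len(items) - 1
    while lo < hi:
        items[lo], items[hi] = items[hi], items[lo]
        lo += 1
        hi -= 1
    return items

def reversed_copy(items):
    """Linear space, O(n): allocates a new list as large as the input."""
    return items[::-1]
```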
Big-O Complexity Chart
Sorting Algorithms
| Algorithm | Time (Best) | Time (Average) | Time (Worst) | Space (Worst) |
|---|---|---|---|---|
| Quicksort | Ω(n log(n)) | Θ(n log(n)) | O(n²) | O(log(n)) |
| Mergesort | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) | O(n) |
| Timsort | Ω(n) | Θ(n log(n)) | O(n log(n)) | O(n) |
| Heapsort | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) | O(1) |
| Bubble Sort | Ω(n) | Θ(n²) | O(n²) | O(1) |
| Insertion Sort | Ω(n) | Θ(n²) | O(n²) | O(1) |
| Selection Sort | Ω(n²) | Θ(n²) | O(n²) | O(1) |
| Tree Sort | Ω(n log(n)) | Θ(n log(n)) | O(n²) | O(n) |
| Shell Sort | Ω(n log(n)) | Θ(n (log(n))²) | O(n (log(n))²) | O(1) |
| Bucket Sort | Ω(n + k) | Θ(n + k) | O(n²) | O(n) |
| Radix Sort | Ω(nk) | Θ(nk) | O(nk) | O(n + k) |
| Counting Sort | Ω(n + k) | Θ(n + k) | O(n + k) | O(k) |
| Cubesort | Ω(n) | Θ(n log(n)) | O(n log(n)) | O(n) |
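To see where entries like these come from, consider mergesort: the input is halved log(n) times, each level does O(n) merge work, and the merge step allocates a temporary list. A standard textbook sketch (not code from the cheatsheet itself):

```python
def mergesort(items):
    """O(n log(n)) time: log(n) levels of splitting, O(n) merging per level.
    O(n) space: the merge step builds a new list."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = mergesort(items[:mid])
    right = mergesort(items[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```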
Data Structure Operations
| Data Structure | Access (Avg) | Search (Avg) | Insertion (Avg) | Deletion (Avg) | Access (Worst) | Search (Worst) | Insertion (Worst) | Deletion (Worst) | Space (Worst) |
|---|---|---|---|---|---|---|---|---|---|
| Array | Θ(1) | Θ(n) | Θ(n) | Θ(n) | O(1) | O(n) | O(n) | O(n) | O(n) |
| Stack | Θ(n) | Θ(n) | Θ(1) | Θ(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
| Queue | Θ(n) | Θ(n) | Θ(1) | Θ(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
| Singly-Linked List | Θ(n) | Θ(n) | Θ(1) | Θ(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
| Doubly-Linked List | Θ(n) | Θ(n) | Θ(1) | Θ(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
| Skip List | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(n) | O(n) | O(n) | O(n) | O(n log(n)) |
| Hash Table | N/A | Θ(1) | Θ(1) | Θ(1) | N/A | O(n) | O(n) | O(n) | O(n) |
| Binary Search Tree | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(n) | O(n) | O(n) | O(n) | O(n) |
| Cartesian Tree | N/A | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | N/A | O(n) | O(n) | O(n) | O(n) |
| B-Tree | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
| Red-Black Tree | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
| Splay Tree | N/A | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | N/A | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
| AVL Tree | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
| KD Tree | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(n) | O(n) | O(n) | O(n) | O(n) |
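The gap between a binary search tree's average Θ(log(n)) and worst-case O(n) rows comes from tree shape: a balanced tree halves the search space at each step, while inserting already-sorted keys degenerates the tree into a linked chain. A minimal search sketch (the class and function names are illustrative):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def bst_search(root, key):
    """Θ(log(n)) on a balanced tree; O(n) if the tree degenerates to a chain."""
    while root is not None:
        if key == root.key:
            return root
        root = root.left if key < root.key else root.right
    return None
```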
Graph Algorithms
| Algorithm | Time (Average) | Time (Worst) | Space (Worst) |
|---|---|---|---|
| DFS (Depth-First Search) | O(V + E) | O(V + E) | O(V) |
| BFS (Breadth-First Search) | O(V + E) | O(V + E) | O(V) |
| Dijkstra (shortest path) | O((V + E) log(V)) | O((V + E) log(V)) | O(V) |
| Bellman-Ford (shortest path) | O(V · E) | O(V · E) | O(V) |
| Kruskal's minimum spanning tree | O(E log(V)) | O(E log(V)) | O(V + E) |
| Prim's minimum spanning tree (using adjacency matrix) | O(V²) | O(V²) | O(1) |
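As an example of where the O(V + E) bound comes from, BFS enqueues each vertex once and scans each edge a constant number of times. A minimal sketch over an adjacency-list dictionary (illustrative; it assumes the graph maps each vertex to a list of neighbors):

```python
from collections import deque

def bfs(graph, start):
    """O(V + E) time: each vertex is enqueued once, each edge scanned once.
    O(V) space: the queue and visited set hold at most V vertices."""
    visited = {start}
    queue = deque([start])
    order = []
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in graph.get(v, []):
            if w not in visited:
                visited.add(w)
                queue.append(w)
    return order
```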
Understanding and applying the concepts of time and space complexity is fundamental to writing efficient algorithms. Evaluating these metrics helps you write code that scales well, ensuring both performance and robustness.
Reference: Big-O Cheatsheet