Which sorting algorithm makes the minimum number of swaps?

Sorting means arranging a set of data in a particular order. There are various algorithms to sort the data (in a data structure) in increasing or decreasing order. These algorithms can be classified on the basis of various factors:

  • In-place sorting & External sorting

In-place sorting means that all the data to be sorted fits in main memory at once and is rearranged within that memory. Examples of in-place sorting algorithms are Selection sort and Bubble sort. In other words, an in-place algorithm needs only O(1) extra space, i.e. it has O(1) auxiliary space complexity.
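
For illustration, here's a minimal sketch of Bubble sort written as an in-place routine (the function name and driver code are just illustrative): the array is rearranged within itself and only a constant number of extra variables are needed.

#include <iostream>
#include <utility>
using namespace std;

// Bubble sort: an in-place sort. Apart from the loop counters and the
// temporary used by swap, no extra memory is needed, i.e. O(1) space.
void bubbleSort(int arr[], int n) {
    for (int i = 0; i < n - 1; i++) {
        for (int j = 0; j < n - 1 - i; j++) {
            if (arr[j] > arr[j + 1]) {
                swap(arr[j], arr[j + 1]);
            }
        }
    }
}

int main() {
    int arr[] = {5, 1, 4, 2, 8};
    int n = 5;

    bubbleSort(arr, n);

    for (int i = 0; i < n; i++) {
        cout << arr[i] << " ";
    }
    cout << endl;   // prints: 1 2 4 5 8

    return 0;
}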

External sorting means that only part of the data to be sorted is present in main memory at a time, while the rest resides in auxiliary (external) memory. Merge sort is a good example of a sort used for external sorting. Merge sort has O(n) space complexity, where n is the size of the input.
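
To make the O(n) figure concrete, here's a minimal in-memory sketch of Merge sort (a simplified version, not a true external sort that streams data to and from disk). The temporary buffer built during each merge is what accounts for the O(n) auxiliary space.

#include <iostream>
#include <vector>
using namespace std;

// Merge sort: merging two sorted halves needs a temporary buffer whose
// size is proportional to the input, hence the O(n) space complexity.
void mergeSort(vector<int> &arr, int left, int right) {
    if (left >= right) {
        return;
    }
    int mid = left + (right - left) / 2;
    mergeSort(arr, left, mid);
    mergeSort(arr, mid + 1, right);

    vector<int> temp;   // O(n) auxiliary memory
    int i = left, j = mid + 1;
    while (i <= mid && j <= right) {
        if (arr[i] <= arr[j]) {
            temp.push_back(arr[i++]);
        } else {
            temp.push_back(arr[j++]);
        }
    }
    while (i <= mid) temp.push_back(arr[i++]);
    while (j <= right) temp.push_back(arr[j++]);

    // Copy the merged result back into the original array
    for (int k = 0; k < (int)temp.size(); k++) {
        arr[left + k] = temp[k];
    }
}

int main() {
    vector<int> arr = {21, 15, 30, 13, 2};

    mergeSort(arr, 0, (int)arr.size() - 1);

    for (int i = 0; i < (int)arr.size(); i++) {
        cout << arr[i] << " ";
    }
    cout << endl;   // prints: 2 13 15 21 30

    return 0;
}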

  • Time Complexity

Time complexity, also known as the running time of an algorithm, is the number of primitive operations executed on a particular input.

Sorting algorithms like Selection sort and Insertion sort have a worst-case time complexity of O(n²), whereas algorithms like Merge sort and Heap sort have a worst-case time complexity of O(n log n).
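
One rough way to see the quadratic growth is to count the primitive operations directly. The sketch below (the helper name is just illustrative) counts the comparisons Insertion sort makes on a reverse-sorted, i.e. worst-case, input; doubling n roughly quadruples the count, which is the O(n²) signature.

#include <iostream>
#include <vector>
using namespace std;

// Count the element comparisons Insertion sort makes on a
// reverse-sorted array of size n (its worst case).
long long insertionSortComparisons(int n) {
    vector<int> arr(n);
    for (int i = 0; i < n; i++) {
        arr[i] = n - i;   // n, n-1, ..., 1
    }

    long long comparisons = 0;
    for (int i = 1; i < n; i++) {
        int key = arr[i];
        int j = i - 1;
        while (j >= 0) {
            comparisons++;
            if (arr[j] > key) {
                arr[j + 1] = arr[j];   // shift the larger element right
                j--;
            } else {
                break;
            }
        }
        arr[j + 1] = key;
    }
    return comparisons;   // n*(n-1)/2 in the worst case
}

int main() {
    cout << "n = 1000: " << insertionSortComparisons(1000) << " comparisons" << endl;
    cout << "n = 2000: " << insertionSortComparisons(2000) << " comparisons" << endl;
    return 0;
}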

Selection sort

Selection sort is an in-place comparison sorting algorithm. It has a worst-case time complexity of O(n²). There are sorting algorithms, like Merge sort and Heap sort, with a better time complexity of O(n log n), but Selection sort has performance advantages over these more complicated algorithms in certain situations, particularly where auxiliary memory is limited.

One advantage that Selection sort has over other sorting algorithms is that it makes the minimum possible number of swaps: at most n − 1 swaps, even in the worst case, for an input of size n.
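
A quick way to check this claim is to count the swaps directly. The sketch below (the helper name is just illustrative) runs Selection sort on a random array of 10 elements and reports the swap count, which can never exceed n − 1 (here, 9).

#include <iostream>
#include <vector>
#include <utility>
#include <cstdlib>
using namespace std;

// Run selection sort and count the swaps: at most one swap per outer
// iteration, so never more than n - 1 swaps in total.
int selectionSortSwaps(vector<int> arr) {
    int n = (int)arr.size();
    int swaps = 0;
    for (int i = 0; i < n - 1; i++) {
        int indexMin = i;
        for (int j = i + 1; j < n; j++) {
            if (arr[j] < arr[indexMin]) {
                indexMin = j;
            }
        }
        if (indexMin != i) {
            swap(arr[i], arr[indexMin]);
            swaps++;
        }
    }
    return swaps;
}

int main() {
    vector<int> arr(10);
    for (int i = 0; i < 10; i++) {
        arr[i] = rand() % 100;
    }

    // For n = 10, the swap count is at most 9
    cout << "Swaps: " << selectionSortSwaps(arr) << endl;

    return 0;
}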

Selection sort algorithm (Pseudocode)

Consider an integer array arr of size n.

Selection sort (arr, n)

   for i = 0 to n - 2:
      indexMin = i
      for j = i + 1 to n - 1:
         if arr[j] < arr[indexMin]:
            indexMin = j
         end
      end
      if indexMin != i:
         swap arr[indexMin] and arr[i]
      end
   end

end

Analysis of Selection Sort

As the pseudocode shows, each iteration of the outer for loop (i = 0 to n − 2) performs at most one swap, and that swap happens at the end of the iteration. This gives at most n − 1 swaps in total.

At the end of the ith iteration of the outer loop, the ith smallest element is placed at index i.

Code Implementation

//
//  main.cpp
//  Selection Sort
//
//  Created by Himanshu on 29/08/21.
//

#include <iostream>
using namespace std;

void printArray (int arr[], int n) {
    for (int i=0; i<n; i++) {
        cout<<arr[i]<<" ";
    }
    cout<<endl;
}

void selectionSort(int arr[], int n) {
    cout<<"Initial array:"<<endl;
    printArray(arr, n);
    
    cout<<"Selection sort"<<endl;
    
    for (int i=0; i<n-1; i++) {
        // Find the index of the smallest element in arr[i..n-1]
        int indexMin = i;

        for (int j=i+1; j<n; j++) {
            if (arr[j] < arr[indexMin]) {
                indexMin = j;
            }
        }

        // Place it at index i (at most one swap per outer iteration)
        if (indexMin != i) {
            swap(arr[i], arr[indexMin]);
        }

        cout<<"Array after iteration "<<(i+1)<<":"<<endl;
        printArray(arr, n);
    }
}

int main() {
    int arr[] = {21, 15, 30, 13, 2};
    int n = 5;

    selectionSort(arr, n);

    return 0;
}

Output

Initial array:
21 15 30 13 2 
Selection sort
Array after iteration 1:
2 15 30 13 21 
Array after iteration 2:
2 13 30 15 21 
Array after iteration 3:
2 13 15 30 21 
Array after iteration 4:
2 13 15 21 30 


Applications of Selection sort
  • It is very useful when a memory write is a costly operation.
  • The number of swaps (writes) becomes an important factor when writes are significantly more expensive than reads, such as with EEPROM or Flash memory, where each write reduces the lifespan of the device (see the sketch below).
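
As a rough illustration of the write-cost argument, the sketch below (the helper names are just illustrative) counts the array writes performed by Selection sort and Bubble sort on the same random input, treating each swap as two writes. The far smaller write count of Selection sort is exactly the advantage described above.

#include <iostream>
#include <vector>
#include <utility>
#include <cstdlib>
using namespace std;

// Array writes made by selection sort: at most one swap (two writes)
// per outer iteration, so at most 2*(n - 1) writes overall.
long long selectionSortWrites(vector<int> arr) {
    int n = (int)arr.size();
    long long writes = 0;
    for (int i = 0; i < n - 1; i++) {
        int indexMin = i;
        for (int j = i + 1; j < n; j++) {
            if (arr[j] < arr[indexMin]) {
                indexMin = j;
            }
        }
        if (indexMin != i) {
            swap(arr[i], arr[indexMin]);
            writes += 2;
        }
    }
    return writes;
}

// Array writes made by bubble sort: every out-of-order neighbouring pair
// triggers a swap, so the write count grows with the number of inversions.
long long bubbleSortWrites(vector<int> arr) {
    int n = (int)arr.size();
    long long writes = 0;
    for (int i = 0; i < n - 1; i++) {
        for (int j = 0; j < n - 1 - i; j++) {
            if (arr[j] > arr[j + 1]) {
                swap(arr[j], arr[j + 1]);
                writes += 2;
            }
        }
    }
    return writes;
}

int main() {
    vector<int> arr(1000);
    for (int i = 0; i < 1000; i++) {
        arr[i] = rand();
    }

    cout << "Selection sort writes: " << selectionSortWrites(arr) << endl;
    cout << "Bubble sort writes:    " << bubbleSortWrites(arr) << endl;

    return 0;
}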
