Sorting
1. Bubble Sort
Description: Bubble Sort repeatedly steps through the list, compares adjacent elements, and swaps them if they are in the wrong order. This process is repeated until the list is sorted.
Time Complexity:
Best Case: O(n) (when the array is already sorted and the pass-level early-exit check is used, as in the example below)
Average Case: O(n^2)
Worst Case: O(n^2)
Space Complexity: O(1) (in-place sort)
Use Cases:
Simple and easy to implement, but not efficient for large datasets.
Useful for small datasets or educational purposes to understand sorting concepts
Example
public class BubbleSort {
    void bubbleSort(int[] arr) {
        int n = arr.length;
        for (int i = 0; i < n - 1; i++) {
            boolean swapped = false;
            for (int j = 0; j < n - i - 1; j++) {
                if (arr[j] > arr[j + 1]) {
                    // Swap arr[j] and arr[j + 1]
                    int temp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = temp;
                    swapped = true;
                }
            }
            // No swaps in this pass: the array is already sorted (O(n) best case)
            if (!swapped) {
                break;
            }
        }
    }
}
2. Selection Sort
Description: Selection Sort divides the list into two parts: a sorted sub-array that grows from the front and the remaining unsorted sub-array. In each pass, it finds the minimum (or maximum, for descending order) element in the unsorted sub-array and swaps it with the first element of the unsorted sub-array. This process continues until the entire list is sorted.
Time Complexity:
Best Case: O(n^2)
Average Case: O(n^2)
Worst Case: O(n^2)
Space Complexity: O(1) (in-place sort)
Use Cases:
Similar to bubble sort, the number of comparisons grows quadratically with the size of the array. Selection sort is inefficient for large datasets.
Useful for small datasets
Example
public class SelectionSort {
void selectionSort(int arr[]) {
int n = arr.length;
for (int i = 0; i < n - 1; i++) {
int minIdx = i;
for (int j = i + 1; j < n; j++) {
if (arr[j] < arr[minIdx]) {
minIdx = j;
}
}
// Swap the found minimum element with the first element
int temp = arr[minIdx];
arr[minIdx] = arr[i];
arr[i] = temp;
}
}
}
3. Insertion Sort
Description: Insertion Sort builds the final sorted array one item at a time. It removes an element from the input data, finds the location it belongs to within the sorted list, and inserts it there.
Time Complexity:
Best Case: O(n) (when the array is already sorted)
Average Case: O(n^2)
Worst Case: O(n^2)
Space Complexity: O(1) (in-place sort)
Use Cases:
Efficient for small or partially sorted datasets thanks to its linear best-case behavior on nearly sorted input.
In-place sorting algorithm, meaning it sorts the data within the original array without requiring additional memory allocation.
Adaptive: efficient for data that is already substantially sorted
Used in hybrid sorting algorithms like Timsort (a small-run sketch follows the example below)
Becomes inefficient for large datasets due to its worst-case quadratic time complexity.
Not as efficient as Merge Sort or Quick Sort for general sorting purposes.
Example
public class InsertionSort {
void insertionSort(int arr[]) {
int n = arr.length;
for (int i = 1; i < n; ++i) {
int key = arr[i];
int j = i - 1;
// Move elements of arr[0..i-1], that are greater than key,
// to one position ahead of their current position
while (j >= 0 && arr[j] > key) {
arr[j + 1] = arr[j];
j = j - 1;
}
arr[j + 1] = key;
}
}
}
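Hybrid sorts exploit this adaptivity by handing small runs to insertion sort. Below is a minimal, hypothetical sketch of a range-based variant (the class name SmallRunInsertionSort, the method name insertionSortRange, and the cutoff idea are illustrative; real hybrids such as Timsort add run detection and other machinery):
public class SmallRunInsertionSort {
    // Hypothetical helper: sorts arr[lo..hi] inclusive, as a hybrid sort
    // might do for runs below a small cutoff (e.g., a few dozen elements).
    static void insertionSortRange(int[] arr, int lo, int hi) {
        for (int i = lo + 1; i <= hi; i++) {
            int key = arr[i];
            int j = i - 1;
            // Shift larger elements of the run one position to the right
            while (j >= lo && arr[j] > key) {
                arr[j + 1] = arr[j];
                j--;
            }
            arr[j + 1] = key;
        }
    }

    public static void main(String[] args) {
        int[] arr = {9, 4, 7, 1, 3, 8, 2};
        // Sort only the middle run arr[2..5]
        insertionSortRange(arr, 2, 5);
        System.out.println(java.util.Arrays.toString(arr)); // [9, 4, 1, 3, 7, 8, 2]
    }
}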
4. Merge Sort
Description: Merge Sort is a divide and conquer algorithm that splits the list into halves, recursively sorts each half, and then merges the sorted halves.
Time Complexity:
Best Case: O(n log n)
Average Case: O(n log n)
Worst Case: O(n log n)
Space Complexity: O(n) (not in-place)
Use Cases:
Sorting linked lists (a linked-list sketch follows the array example below)
External sorting (e.g., sorting large data sets stored on disk)
Example
public class MergeSort {
public static void mergeSort(int[] arr, int left, int right) {
if (left < right) {
// Find the middle point
int mid = left + (right - left) / 2;
// Sort first and second halves
mergeSort(arr, left, mid);
mergeSort(arr, mid + 1, right);
// Merge the sorted halves
merge(arr, left, mid, right);
}
}
private static void merge(int[] arr, int left, int mid, int right) {
int n1 = mid - left + 1;
int n2 = right - mid;
// Create temporary sub-arrays
int[] leftArray = new int[n1];
int[] rightArray = new int[n2];
// Copy data to temporary arrays
System.arraycopy(arr, left, leftArray, 0, n1);
System.arraycopy(arr, mid + 1, rightArray, 0, n2);
// Merge the temporary arrays
/* Iterate through helper array. Compare the left and right half, copying back
the smaller element from the two halves into the original array. */
int i = 0, j = 0, k = left;
while (i < n1 && j < n2) {
if (leftArray[i] <= rightArray[j]) {
arr[k] = leftArray[i];
i++;
} else {
arr[k] = rightArray[j];
j++;
}
k++;
}
// Copy the remaining elements
while (i < n1) {
arr[k] = leftArray[i];
i++;
k++;
}
while (j < n2) {
arr[k] = rightArray[j];
j++;
k++;
}
}
public static void main(String[] args) {
int[] arr = {6, 5, 3, 1, 8, 7, 2, 4};
mergeSort(arr, 0, arr.length - 1);
System.out.println("Sorted array:");
for (int i : arr) {
System.out.print(i + " ");
}
}
}
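Because merging needs only sequential access, merge sort adapts naturally to linked lists, which is the use case noted above. Below is a minimal sketch assuming a hypothetical singly linked Node class (not part of the array example):
public class LinkedListMergeSort {
    // Hypothetical node type for this sketch
    static class Node {
        int val;
        Node next;
        Node(int val) { this.val = val; }
    }

    static Node mergeSort(Node head) {
        if (head == null || head.next == null) return head;
        // Split the list in half using the slow/fast pointer technique
        Node slow = head, fast = head.next;
        while (fast != null && fast.next != null) {
            slow = slow.next;
            fast = fast.next.next;
        }
        Node second = slow.next;
        slow.next = null;
        // Recursively sort both halves, then merge them
        return merge(mergeSort(head), mergeSort(second));
    }

    static Node merge(Node a, Node b) {
        Node dummy = new Node(0);
        Node tail = dummy;
        while (a != null && b != null) {
            if (a.val <= b.val) { tail.next = a; a = a.next; }
            else { tail.next = b; b = b.next; }
            tail = tail.next;
        }
        tail.next = (a != null) ? a : b;
        return dummy.next;
    }

    public static void main(String[] args) {
        Node head = new Node(3);
        head.next = new Node(1);
        head.next.next = new Node(2);
        for (Node n = mergeSort(head); n != null; n = n.next) {
            System.out.print(n.val + " "); // 1 2 3
        }
    }
}
Splitting rewires node links rather than copying elements, so no auxiliary arrays are needed.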
5. Quick Sort
Description: Quick Sort is a divide and conquer algorithm that picks an element as a pivot, partitions the array around the pivot, and recursively sorts the partitions.
Time Complexity:
Best Case: O(n log n)
Average Case: O(n log n)
Worst Case: O(n^2) (rare; can be mitigated with good pivot selection strategies such as randomized or median-of-three pivots). Repeatedly partitioning the array (and its sub-arrays) around a pivot eventually sorts it, but because the pivot is not guaranteed to be the median (or anywhere near it), the partitions can be very unbalanced. This is the reason for the O(n^2) worst-case runtime.
Space Complexity: O(log n) (in-place; recursion stack)
Use Cases:
Generally faster in practice than other O(n log n) algorithms
Used in systems and applications where average-case performance matters.
Efficient for large datasets due to its average O(n log n) time complexity.
In-place sorting algorithm, meaning it sorts the data within the original array without requiring significant additional memory allocation.
Pivot selection strategy can significantly impact performance (a randomized-pivot sketch follows the example below).
Example
public class QuickSort {
public static void quickSort(int[] arr, int low, int high) {
if (low < high) {
// pi is partitioning index, arr[pi] is now at right place
int pi = partition(arr, low, high);
// Recursively sort elements before and after partition
quickSort(arr, low, pi - 1);
quickSort(arr, pi + 1, high);
}
}
private static int partition(int[] arr, int low, int high) {
int pivot = arr[high];
int i = (low - 1); // index of smaller element
for (int j = low; j <= high - 1; j++) {
// If current element is smaller than the pivot
if (arr[j] < pivot) {
i++;
swap(arr, i, j);
}
}
swap(arr, i + 1, high);
return (i + 1);
}
private static void swap(int[] arr, int i, int j) {
int temp = arr[i];
arr[i] = arr[j];
arr[j] = temp;
}
    public static void main(String[] args) {
        int[] arr = {10, 7, 8, 9, 1, 5};
        quickSort(arr, 0, arr.length - 1);
        System.out.println("Sorted array:");
        for (int i : arr) System.out.print(i + " ");
    }
}
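To reduce the chance of hitting the O(n^2) worst case on adversarial or already sorted input, the pivot can be chosen at random before partitioning. A minimal sketch of this variation (the class and method names are illustrative):
import java.util.concurrent.ThreadLocalRandom;

public class RandomizedQuickSort {
    public static void quickSort(int[] arr, int low, int high) {
        if (low < high) {
            int pi = randomizedPartition(arr, low, high);
            quickSort(arr, low, pi - 1);
            quickSort(arr, pi + 1, high);
        }
    }

    // Pick a random index in [low, high], move that element to the end,
    // then partition exactly as in the example above.
    private static int randomizedPartition(int[] arr, int low, int high) {
        int randomIndex = ThreadLocalRandom.current().nextInt(low, high + 1);
        swap(arr, randomIndex, high);
        int pivot = arr[high];
        int i = low - 1;
        for (int j = low; j < high; j++) {
            if (arr[j] < pivot) {
                i++;
                swap(arr, i, j);
            }
        }
        swap(arr, i + 1, high);
        return i + 1;
    }

    private static void swap(int[] arr, int i, int j) {
        int temp = arr[i];
        arr[i] = arr[j];
        arr[j] = temp;
    }
}
Apart from the random swap, the partition step is the same Lomuto partition used in the example above.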
6. Heap Sort
Description: Heap Sort is a sorting technique that leverages the properties of a heap data structure to efficiently sort an array. It converts the input array into a max-heap (where the root holds the largest value), then repeatedly swaps the root with the last element of the heap region, shrinking the heap and growing a sorted suffix at the end of the array. This process continues until the entire array is sorted.
Time Complexity:
Best Case: O(n log n)
Average Case: O(n log n)
Worst Case: O(n log n)
Space Complexity: O(1) (in-place)
Use Cases:
Situations where space complexity matters
Embedded systems where memory usage is constrained
Example
public class HeapSort {
public static void heapSort(int[] arr) {
int n = arr.length;
// Build a max heap
for (int i = n / 2 - 1; i >= 0; i--) {
heapify(arr, n, i);
}
// Extract one by one from heap
for (int i = n - 1; i > 0; i--) {
// Move current root to end
swap(arr, 0, i);
// call max heapify on the reduced heap
heapify(arr, i, 0);
}
}
private static void heapify(int[] arr, int n, int i) {
int largest = i; // Initialize largest as root
int left = 2 * i + 1; // left = 2*i + 1
int right = 2 * i + 2; // right = 2*i + 2
// If left child is larger than root
if (left < n && arr[left] > arr[largest])
largest = left;
// If right child is larger than largest so far
if (right < n && arr[right] > arr[largest])
largest = right;
// If largest is not root
if (largest != i) {
swap(arr, i, largest);
// Recursively heapify the affected sub-tree
heapify(arr, n, largest);
}
}
private static void swap(int[] arr, int i, int j) {
int temp = arr[i];
arr[i] = arr[j];
arr[j] = temp;
}
public static void main(String[] args) {
int[] arr = {12, 11, 13, 5, 6, 7};
int n = arr.length;
heapSort(arr);
System.out.println("Sorted array:");
for (int i : arr) System.out.print(i + " ");
}
}
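For contrast with the in-place example above, the same result can be produced with java.util.PriorityQueue (the JDK's binary min-heap), at the cost of O(n) extra space, which is exactly what heap sort avoids. A minimal sketch:
import java.util.PriorityQueue;

public class PriorityQueueSort {
    public static void main(String[] args) {
        int[] arr = {12, 11, 13, 5, 6, 7};
        // Offer every element into a min-heap (O(n log n) total), ...
        PriorityQueue<Integer> heap = new PriorityQueue<>();
        for (int value : arr) heap.offer(value);
        // ... then poll the elements back out in ascending order.
        for (int i = 0; i < arr.length; i++) arr[i] = heap.poll();
        System.out.println("Sorted array:");
        for (int i : arr) System.out.print(i + " ");
    }
}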
7. Radix Sort
Description: Radix sort is a non-comparative sorting technique that works efficiently for integers by sorting digits individually, from the least significant digit (LSD) to the most significant digit (MSD). It leverages the concept of place value to group elements based on their digits and then sorts them within those groups.
Time Complexity:
Best Case: O(nk), where k is the number of digits in the largest key
Average Case: O(nk)
Worst Case: O(nk)
Space Complexity: O(n+k)
Use Cases:
Sorting integers or fixed-length strings
Useful for large datasets where comparison-based sorting algorithms are inefficient
Efficient for sorting integers with a fixed number of digits or limited range.
Stable sorting algorithm, meaning it preserves the original order of equal elements.
Non-comparative sorting, offering advantages for specific hardware architectures.
Not as efficient for sorting strings or data with variable-length keys.
Requires additional space for buckets during sorting.
Example
public class RadixSort {
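    // Assumes non-negative integer keys (digits are extracted with division and modulo)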
public static void radixSort(int[] arr) {
int max = getMax(arr); // Find the maximum value
for (int exp = 1; max / exp > 0; exp *= 10) {
countSort(arr, exp);
}
}
private static void countSort(int[] arr, int exp) {
int n = arr.length;
int[] output = new int[n]; // Output array
int[] count = new int[10]; // Count array for digits
// Store count of occurrences in count[]
for (int i = 0; i < n; i++) {
count[(arr[i] / exp) % 10]++;
}
// Cumulative count
for (int i = 1; i < 10; i++) {
count[i] += count[i - 1];
}
// Build the output array
for (int i = n - 1; i >= 0; i--) {
output[count[(arr[i] / exp) % 10] - 1] = arr[i];
count[(arr[i] / exp) % 10]--;
}
// Copy the sorted elements back to original array
System.arraycopy(output, 0, arr, 0, n);
}
private static int getMax(int[] arr) {
int max = arr[0];
for (int i = 1; i < arr.length; i++) {
if (arr[i] > max) {
max = arr[i];
}
}
return max;
}
public static void main(String[] args) {
int[] arr = {170, 45, 75, 90, 802, 24, 2, 66};
radixSort(arr);
System.out.println("Sorted array:");
for (int i : arr) System.out.print(i + " ");
}
}
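The example above works for non-negative integers only, since digits are extracted with division and modulo. One common workaround, sketched below with a hypothetical wrapper (radixSortSigned is not a standard method), is to shift all values by the minimum before sorting and shift them back afterwards; the sketch assumes the shift does not overflow int:
public class SignedRadixSort {
    // Hypothetical wrapper around the radixSort shown above
    public static void radixSortSigned(int[] arr) {
        if (arr.length == 0) return;
        int min = arr[0];
        for (int value : arr) min = Math.min(min, value);
        if (min < 0) {
            // Shift into the non-negative range, sort, then shift back
            for (int i = 0; i < arr.length; i++) arr[i] -= min;
            RadixSort.radixSort(arr);
            for (int i = 0; i < arr.length; i++) arr[i] += min;
        } else {
            RadixSort.radixSort(arr);
        }
    }
}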
8. Bucket Sort
Description: Bucket sort is a sorting technique that divides the elements into a fixed number of buckets (or bins) based on a specific range or hash function. Each bucket is then sorted individually using any suitable sorting algorithm like insertion sort. Finally, the sorted buckets are concatenated to form the final sorted array.
Time Complexity:
Best Case: O(n+k)
Average Case: O(n+k)
Worst Case: O(n^2)
Space Complexity: O(n+k)
Use Cases:
Sorting uniformly distributed data
Useful when input is drawn from a known distribution
Efficient for data with a limited range or when elements can be distributed evenly across buckets.
Works well with other sorting algorithms for the individual buckets (e.g., insertion sort for small bucket sizes).
Performance depends heavily on the chosen number of buckets and the distribution of elements.
Might require additional space for the buckets array.
Example
import java.util.ArrayList;
import java.util.Collections;

public class BucketSort {
    // Assumes the input values are uniformly distributed in the range [0, 1)
    void bucketSort(float[] arr, int n) {
        if (n <= 0)
            return;
        // Create n empty buckets
        ArrayList<Float>[] bucket = new ArrayList[n];
        for (int i = 0; i < n; i++) {
            bucket[i] = new ArrayList<Float>();
        }
        // Place each element into its bucket: values in [0, 1) map to indices 0..n-1
        for (int i = 0; i < n; i++) {
            int bucketIndex = (int) (arr[i] * n);
            bucket[bucketIndex].add(arr[i]);
        }
        // Sort each bucket individually
        for (int i = 0; i < n; i++) {
            Collections.sort(bucket[i]);
        }
        // Concatenate the sorted buckets back into the original array
        int index = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < bucket[i].size(); j++) {
                arr[index++] = bucket[i].get(j);
            }
        }
    }
}
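A minimal usage sketch for the class above, assuming inputs uniformly distributed in [0, 1) as the bucket index formula requires (the class name BucketSortDemo is illustrative):
public class BucketSortDemo {
    public static void main(String[] args) {
        float[] arr = {0.42f, 0.32f, 0.23f, 0.52f, 0.25f, 0.47f, 0.51f};
        new BucketSort().bucketSort(arr, arr.length);
        System.out.println("Sorted array:");
        for (float f : arr) System.out.print(f + " ");
    }
}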
9. Counting Sort
Description: Counting sort is a sorting algorithm that efficiently sorts elements when the range of possible values is limited. It works by creating a count array to store the frequency of each unique element in the original array. This count array is then used to place elements in their correct positions in the sorted output array.
Time Complexity:
Best Case: O(n+k)
Average Case: O(n+k)
Worst Case: O(n+k)
Space Complexity: O(n+k) (count array of size k plus output array of size n)
Use Cases:
Sorting integers within a small range
Efficient for datasets where the range of the elements is not significantly larger than the number of elements
Not suitable for large ranges of possible values, as the count array size becomes significant.
Example
public class CountingSort {
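    // Assumes non-negative integer keys; the count array is sized by the maximum value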
public static void countingSort(int[] arr, int n) {
int max = arr[0];
for (int i = 1; i < n; i++) {
if (arr[i] > max) {
max = arr[i];
}
}
int[] count = new int[max + 1];
int[] output = new int[n];
// Store count of occurrences in count[]
for (int i = 0; i < n; i++) {
count[arr[i]]++;
}
// Prefix sum for cumulative count
for (int i = 1; i <= max; i++) {
count[i] += count[i - 1];
}
// Build the output array (stable placement)
for (int i = n - 1; i >= 0; i--) {
output[count[arr[i]] - 1] = arr[i];
count[arr[i]]--;
}
        // Copy the sorted elements back to the original array
        for (int i = 0; i < n; i++) {
            arr[i] = output[i];
        }
}
public static void main(String[] args) {
int[] arr = {6, 1, 2, 3, 5, 4};
int n = arr.length;
countingSort(arr, n);
System.out.println("Sorted array:");
for (int i : arr) System.out.print(i + " ");
}
}
Comparison Table
| Algorithm | Best Case | Average Case | Worst Case | Space | Use Cases |
| --- | --- | --- | --- | --- | --- |
| Bubble Sort | O(n) | O(n^2) | O(n^2) | O(1) | Small datasets, educational purposes |
| Selection Sort | O(n^2) | O(n^2) | O(n^2) | O(1) | Small datasets |
| Insertion Sort | O(n) | O(n^2) | O(n^2) | O(1) | Small datasets, adaptive sorting |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Linked lists, external sorting |
| Quick Sort | O(n log n) | O(n log n) | O(n^2) | O(log n) | General-purpose, fast average performance |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | Space-constrained environments |
| Radix Sort | O(nk) | O(nk) | O(nk) | O(n+k) | Large datasets, integers/fixed-length strings |
| Bucket Sort | O(n+k) | O(n+k) | O(n^2) | O(n+k) | Uniformly distributed data |
| Counting Sort | O(n+k) | O(n+k) | O(n+k) | O(n+k) | Small range integers |