Big O Notation

About

In computer science, complexity is a measure of the resources required for an algorithm to solve a problem. The two most commonly analyzed types of complexity are:

  1. Time Complexity: How the runtime of an algorithm increases with the size of the input.

  2. Space Complexity: How the memory usage of an algorithm increases with the size of the input.

Both time and space complexity are often expressed using Big O notation, which describes the upper bound of an algorithm's growth rate.

When we say "algorithm X is asymptotically more efficient than algorithm Y", we are comparing the growth rates of their time or space complexity as the size of the input (N) becomes very large (approaches infinity). The growth rate of X's runtime (or space usage) is smaller than Y's as N→∞. This means X will eventually outperform Y for sufficiently large inputs; for small inputs, Y may still be faster, because Big O notation ignores constant factors.
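As a concrete illustration of why constant factors matter for small inputs, consider a hypothetical O(n) algorithm costing 100n steps versus an O(n²) algorithm costing n² steps (the cost models and class name here are made up for this sketch):

```java
public class GrowthCrossover {
    // Hypothetical cost models: a linear algorithm with a large
    // constant factor (100n steps) versus a plain quadratic one (n² steps).
    static long linearCost(long n)    { return 100 * n; }
    static long quadraticCost(long n) { return n * n; }

    public static void main(String[] args) {
        // Below n = 100 the quadratic algorithm does less work;
        // past that crossover the linear one wins, and the gap keeps growing.
        for (long n : new long[]{10, 100, 1_000, 10_000}) {
            System.out.printf("n=%-6d  100n=%-10d  n^2=%d%n",
                    n, linearCost(n), quadraticCost(n));
        }
    }
}
```

At n = 10 the quadratic algorithm needs 100 steps against the linear one's 1,000; by n = 10,000 the quadratic cost is 100 times larger.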

Comparing Different Complexities

The following table compares the growth of various time complexities with different input sizes n:

n          | O(1) | O(log n) | O(n)      | O(n log n) | O(n²)     | O(n³)     | O(2ⁿ)
1          | 1    | 0        | 1         | 0          | 1         | 1         | 2
10         | 1    | 1        | 10        | 10         | 100       | 1,000     | 1,024
100        | 1    | 2        | 100       | 200        | 10,000    | 1,000,000 | 1.27e30
1,000      | 1    | 3        | 1,000     | 3,000      | 1,000,000 | 1.0e9     | 1.07e301
10,000     | 1    | 4        | 10,000    | 40,000     | 1.0e8     | 1.0e12    | -
100,000    | 1    | 5        | 100,000   | 500,000    | 1.0e10    | 1.0e15    | -
1,000,000  | 1    | 6        | 1,000,000 | 6,000,000  | 1.0e12    | 1.0e18    | -

The logarithms in this table are base 10 for readability; the base only changes the values by a constant factor. A dash (-) marks values too large to be practical.

Amortized Time Complexity

Amortized analysis averages the cost of an operation over a worst-case sequence of operations. An individual operation may occasionally be expensive, but if the expensive cases are rare enough, the average cost per operation stays low.
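The classic illustration of amortized analysis is a dynamic array that doubles its capacity when full: most appends are cheap, the occasional resize costs O(n), and the average over any sequence of appends works out to O(1) per append. A minimal sketch (the class name and doubling factor are illustrative):

```java
public class DynamicArray {
    private int[] data = new int[1];
    private int size = 0;

    // Append one element; doubles the backing array when full.
    // A resize copies `size` elements (O(n)), but because capacity doubles,
    // the total copy work over n appends is at most 1 + 2 + 4 + ... + n < 2n,
    // so the amortized cost per append is O(1).
    public void add(int value) {
        if (size == data.length) {
            int[] bigger = new int[data.length * 2];
            System.arraycopy(data, 0, bigger, 0, size);
            data = bigger;
        }
        data[size++] = value;
    }

    public int get(int i) { return data[i]; }

    public int size() { return size; }

    public static void main(String[] args) {
        DynamicArray a = new DynamicArray();
        for (int i = 0; i < 1000; i++) a.add(i);
        System.out.println(a.size()); // 1000
    }
}
```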

Time Complexity

Constant Time - O(1)

An algorithm runs in constant time if its runtime does not change with the input size.

Example: Accessing an array element by index.

int getElement(int[] arr, int index) {
    return arr[index]; // O(1)
}

Logarithmic Time - O(log n)

An algorithm runs in logarithmic time if its runtime grows logarithmically with the input size. These algorithms reduce the problem size by a fraction (typically half) at each step. This means that as the input size increases, the number of steps needed grows logarithmically rather than linearly.

What is the base of the logarithm used here?

It does not matter: by the change-of-base rule, log_a(n) = log_b(n) / log_b(a), so logarithms with different bases differ only by a constant factor, which Big O notation ignores. They are all written O(log n).
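The constant-factor claim can be checked numerically (a small sketch; note that `Math.log` in Java is the natural logarithm):

```java
public class LogBaseDemo {
    // log base 2 via change of base: log2(n) = ln(n) / ln(2)
    static double log2(double n) { return Math.log(n) / Math.log(2); }

    public static void main(String[] args) {
        // The ratio log2(n) / ln(n) is the same constant (1/ln 2 ≈ 1.4427)
        // for every n, which is why the base is dropped in O(log n).
        for (double n : new double[]{10, 1_000, 1_000_000}) {
            System.out.printf("n=%.0f  log2/ln = %.4f%n", n, log2(n) / Math.log(n));
        }
    }
}
```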

Example: Binary search on a sorted array.

int binarySearch(int[] arr, int target) {
    int left = 0, right = arr.length - 1;
    while (left <= right) {
        int mid = left + (right - left) / 2;
        if (arr[mid] == target) return mid;
        if (arr[mid] < target) left = mid + 1;
        else right = mid - 1;
    }
    return -1; // O(log n)
}

Logarithmic Growth

For an array of size n, the number of times you can halve the array before you are left with a single element is log₂(n). This is why the time complexity of binary search is O(log n).

  • For n = 16, the halving sequence is:

    • 16 → 8 → 4 → 2 → 1

    • Total halvings: 4, which is exactly log₂(16)
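The halving count above can be sketched in code (an illustrative helper, not from the original):

```java
public class HalvingSteps {
    // Counts how many times n can be halved before reaching 1,
    // i.e. floor(log2(n)) for n >= 1.
    static int countHalvings(int n) {
        int steps = 0;
        while (n > 1) {
            n /= 2;
            steps++;
        }
        return steps;
    }

    public static void main(String[] args) {
        System.out.println(countHalvings(16)); // 16 → 8 → 4 → 2 → 1: prints 4
    }
}
```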

Linear Time - O(n)

An algorithm runs in linear time if its runtime grows linearly with the input size.

Example: Finding the maximum element in an array.

int findMax(int[] arr) {
    int max = arr[0];
    for (int i = 1; i < arr.length; i++) {
        if (arr[i] > max) {
            max = arr[i];
        }
    }
    return max; // O(n)
}

Linearithmic Time - O(n log n)

An algorithm runs in linearithmic time if its runtime grows in proportion to n log n: it scales linearly with the input size n, with an additional logarithmic factor.

Example: Efficient sorting algorithms like Merge Sort and Quick Sort (Quick Sort is O(n log n) on average, but O(n²) in the worst case).

import java.util.Arrays;

public class MergeSortExample {

    void mergeSort(int[] arr, int left, int right) {
        if (left < right) {
            int mid = left + (right - left) / 2; // avoids int overflow
            mergeSort(arr, left, mid);
            mergeSort(arr, mid + 1, right);
            merge(arr, left, mid, right);
        }
    }

    void merge(int[] arr, int left, int mid, int right) {
        int n1 = mid - left + 1;
        int n2 = right - mid;

        int[] leftArr = new int[n1];
        int[] rightArr = new int[n2];

        // Copy data to temp arrays
        System.arraycopy(arr, left, leftArr, 0, n1);
        System.arraycopy(arr, mid + 1, rightArr, 0, n2);

        int i = 0, j = 0, k = left;

        // Merge the temporary arrays back into arr
        while (i < n1 && j < n2) {
            if (leftArr[i] <= rightArr[j]) {
                arr[k] = leftArr[i];
                i++;
            } else {
                arr[k] = rightArr[j];
                j++;
            }
            k++;
        }

        // Copy remaining elements of leftArr
        while (i < n1) {
            arr[k] = leftArr[i];
            i++;
            k++;
        }

        // Copy remaining elements of rightArr
        while (j < n2) {
            arr[k] = rightArr[j];
            j++;
            k++;
        }
    }

    public static void main(String[] args) {
        int[] arr = {12, 11, 13, 5, 6, 7};
        MergeSortExample sorter = new MergeSortExample();
        System.out.println("Original array: " + Arrays.toString(arr));
        sorter.mergeSort(arr, 0, arr.length - 1);
        System.out.println("Sorted array: " + Arrays.toString(arr));
    }
}

Linearithmic Growth

For an array of size n, the total time to sort the array is the number of levels of division (logarithmic) multiplied by the time to process each level (linear).

  • Levels of Division: log n

  • Processing Each Level: n

  • Total Time Complexity: n log n

Polynomial (Quadratic Time) - O(n²)

An algorithm runs in quadratic time if its runtime grows proportionally to the square of the input size.

Example: Simple sorting algorithms like Bubble Sort, Selection Sort.

void bubbleSort(int[] arr) {
    int n = arr.length;
    for (int i = 0; i < n - 1; i++) {
        for (int j = 0; j < n - 1 - i; j++) {
            if (arr[j] > arr[j + 1]) {
                int temp = arr[j];
                arr[j] = arr[j + 1];
                arr[j + 1] = temp;
            }
        }
    } // O(n²)
}

Polynomial (Cubic Time) - O(n³)

An algorithm runs in cubic time if its runtime grows proportionally to the cube of the input size.

Example: Certain dynamic programming algorithms.

void exampleCubic(int n) {
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            for (int k = 0; k < n; k++) {
                // Some operations
            }
        }
    } // O(n³)
}

Super-Polynomial Growth - O(n^log n)

Super-polynomial growth sits between polynomial and exponential growth: it is faster than any polynomial n^k but slower than exponential 2ⁿ. Examples include some algorithms involving combinatorics or recursion trees.

public class SuperPolynomialExample {
    public static void main(String[] args) {
        int n = 10; // input size
        superPolynomialAlgorithm(n);
    }

    // The inner loop runs about i^(ln n) times (Math.log is the natural
    // logarithm in Java), so the total work grows roughly like n^(log n):
    // faster than any polynomial, but slower than 2ⁿ.
    public static void superPolynomialAlgorithm(int n) {
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= Math.pow(i, Math.log(n)); j++) {
                System.out.println("i: " + i + ", j: " + j);
            }
        }
    }
}

Exponential Time - O(2ⁿ)

An algorithm runs in exponential time if its runtime roughly doubles with each additional input element. Example: solving the traveling salesman problem by brute force. (Strictly, enumerating all tours as below is O(n!), which grows even faster than 2ⁿ; the O(2ⁿ·n²) bound belongs to the Held-Karp dynamic-programming solution.)

// Brute force: try every tour. Typical initial call (an illustrative convention):
// visited[0] = true; tsp(graph, visited, 0, n, 1, 0, Integer.MAX_VALUE);
int tsp(int[][] graph, boolean[] visited, int currPos, int n, int count, int cost, int ans) {
    if (count == n && graph[currPos][0] > 0) {
        return Math.min(ans, cost + graph[currPos][0]);
    }
    for (int i = 0; i < n; i++) {
        if (!visited[i] && graph[currPos][i] > 0) {
            visited[i] = true;
            ans = tsp(graph, visited, i, n, count + 1, cost + graph[currPos][i], ans);
            visited[i] = false;
        }
    }
    return ans; // enumerates all tours: O(n!)
}
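For a textbook O(2ⁿ) shape, naive recursive Fibonacci is a simpler illustration: each call spawns two further calls, so the call tree roughly doubles in size with each increment of n (an illustrative sketch, not one of the original examples):

```java
public class NaiveFibonacci {
    // Each call branches into two recursive calls, giving a call tree
    // of roughly 2^n nodes, hence exponential time.
    static long fib(int n) {
        if (n <= 1) return n;
        return fib(n - 1) + fib(n - 2);
    }

    public static void main(String[] args) {
        System.out.println(fib(10)); // prints 55
    }
}
```

Memoizing the recursion collapses this to O(n), which is why exponential blow-ups often signal repeated subproblems.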

Space Complexity

Constant Space - O(1)

An algorithm uses constant space if the amount of memory it requires does not change with the input size. Example: Swapping two variables.

void swap(int[] arr, int i, int j) {
    int temp = arr[i];
    arr[i] = arr[j];
    arr[j] = temp; // O(1)
}

Linear Space - O(n)

An algorithm uses linear space if the amount of memory it requires grows linearly with the input size. Example: Creating a copy of an array.

int[] copyArray(int[] arr) {
    int[] copy = new int[arr.length];
    for (int i = 0; i < arr.length; i++) {
        copy[i] = arr[i];
    }
    return copy; // O(n)
}

Quadratic Space - O(n²)

An algorithm uses quadratic space if the amount of memory it requires grows proportionally to the square of the input size. Example: Creating a 2D array.

int[][] create2DArray(int n) {
    int[][] array = new int[n][n];
    // Initialize array
    return array; // O(n²)
}

Logarithmic Space - O(log n)

An algorithm uses logarithmic space if the amount of memory it requires grows logarithmically with the input size. Example: recursive algorithms that halve the problem at each step; the recursion depth, and therefore the number of stack frames, is O(log n).

void recursiveLogarithmic(int n) {
    if (n <= 1) return;
    recursiveLogarithmic(n / 2);
} // O(log n)

Comparison

(Reference charts for searching algorithms, common data structure operations, and array sorting algorithms appeared here.)
