Big O Notation

About

In computer science, complexity is a measure of the resources required for an algorithm to solve a problem. The two most commonly analyzed types of complexity are:

  1. Time Complexity: How the runtime of an algorithm increases with the size of the input.

  2. Space Complexity: How the memory usage of an algorithm increases with the size of the input.

Both time and space complexity are often expressed using Big O notation, which describes the upper bound of an algorithm's growth rate.

When we say "algorithm X is asymptotically more efficient than algorithm Y", we are comparing the growth rates of their time or space complexity as the size of the input (N) becomes very large (approaches infinity). The growth rate of X's runtime (or space usage) is smaller than Y's runtime (or space usage) as N→∞. So X will always be a better choice for large inputs.

Comparing Different Complexities

The following table compares the growth of various time complexities with different input sizes n:

| n         | O(1) | O(log n) | O(n)      | O(n log n) | O(n²)     | O(n³)     | O(2ⁿ)    |
| --------- | ---- | -------- | --------- | ---------- | --------- | --------- | -------- |
| 1         | 1    | 0        | 1         | 0          | 1         | 1         | 2        |
| 10        | 1    | 1        | 10        | 10         | 100       | 1,000     | 1,024    |
| 100       | 1    | 2        | 100       | 200        | 10,000    | 1,000,000 | 1.27e30  |
| 1,000     | 1    | 3        | 1,000     | 3,000      | 1,000,000 | 1.0e9     | 1.07e301 |
| 10,000    | 1    | 4        | 10,000    | 40,000     | 1.0e8     | 1.0e12    | -        |
| 100,000   | 1    | 5        | 100,000   | 500,000    | 1.0e10    | 1.0e15    | -        |
| 1,000,000 | 1    | 6        | 1,000,000 | 6,000,000  | 1.0e12    | 1.0e18    | -        |

(The logarithm values in this table use base 10; as explained below, the base only changes a constant factor.)

Amortized Time Complexity

Amortized analysis averages the cost of an operation over a worst-case sequence of operations: a single operation may occasionally be expensive, but the average cost per operation stays low. The classic example is appending to a dynamic array, which is O(1) amortized even though an occasional append triggers an O(n) resize.
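
A hedged illustration using CPython's built-in list, whose append is amortized O(1): most appends do constant work, and the rare internal buffer resize (which copies every element) is infrequent enough that the average cost stays constant.

```python
import sys

lst = []
for i in range(17):
    lst.append(i)                 # amortized O(1) per append
    print(i, sys.getsizeof(lst))  # memory jumps only when the list resizes
```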

Time Complexity

Constant Time - O(1)

An algorithm runs in constant time if its runtime does not change with the input size.

Example: Accessing an array element by index.
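
A minimal sketch in Python (the function name is ours, not from the text): the lookup takes the same time whether the list has ten elements or ten million.

```python
def get_first(items):
    """Index access takes the same time regardless of len(items): O(1)."""
    return items[0]
```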

Logarithmic Time - O(log n)

An algorithm runs in logarithmic time if its runtime grows logarithmically with the input size. Such algorithms shrink the problem by a constant factor (typically half) at each step, so the number of steps needed grows logarithmically rather than linearly as the input grows.

What is the base of the logarithm used here?

The base does not matter: logarithms of different bases differ only by a constant factor (log_a(n) = log_b(n) / log_b(a)), and Big O notation ignores constant factors, so every logarithmic runtime is written O(log n).

Example: Binary search.

Logarithmic Growth

For an array of size n, the number of times you can halve the array before you are left with a single element is log₂(n). This is why the time complexity of binary search is O(log n).

  • For n=16, the steps are:

    • Step 1: 16 elements

    • Step 2: 8 elements

    • Step 3: 4 elements

    • Step 4: 2 elements

    • Step 5: 1 element

    • Total: 4 halvings (log₂(16) = 4), i.e. 5 steps counting the starting array
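
The halving pattern above is exactly what binary search does. A minimal Python sketch (assuming a sorted input list; names are illustrative):

```python
def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent. O(log n)."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2      # midpoint of the current search window
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1          # discard the left half
        else:
            hi = mid - 1          # discard the right half
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
```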

Linear Time - O(n)

An algorithm runs in linear time if its runtime grows linearly with the input size.

Example: Finding the maximum element in an array.
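
A minimal sketch in Python (the function name is ours): one pass over the array, so the work grows linearly with its length.

```python
def find_max(arr):
    """Scan every element once; runtime grows linearly with len(arr): O(n)."""
    largest = arr[0]
    for x in arr[1:]:
        if x > largest:
            largest = x
    return largest

print(find_max([4, 1, 9, 3]))  # 9
```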

Linearithmic Time - O(n log n)

An algorithm runs in linearithmic time if its runtime grows in proportion to n log n: linearly with the input size n, multiplied by an additional logarithmic factor.

Example: Efficient sorting algorithms like Merge Sort and (in the average case) Quick Sort.

Linearithmic Growth

For an array of size n, the total time to sort the array is the number of levels of division (logarithmic) multiplied by the time to process each level (linear).

  • Levels of Division: log n

  • Processing Each Level: n

  • Total Time Complexity: n log n
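
A minimal Merge Sort sketch in Python (a standard textbook formulation, offered as an illustration rather than this page's reference implementation): each of the ~log n levels of recursion does O(n) merging work.

```python
def merge_sort(arr):
    """Sort arr in O(n log n): log n levels of splitting, O(n) merge per level."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```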

Polynomial (Quadratic Time) - O(n²)

An algorithm runs in quadratic time if its runtime grows proportionally to the square of the input size.

Example: Simple sorting algorithms like Bubble Sort, Selection Sort.
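
A minimal Bubble Sort sketch in Python (illustrative, not a reference implementation): the nested loops make the worst-case comparison count proportional to n².

```python
def bubble_sort(arr):
    """Nested passes over the data give O(n²) comparisons in the worst case."""
    n = len(arr)
    for i in range(n):
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]  # swap neighbors
    return arr

print(bubble_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```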

Polynomial (Cubic Time) - O(n³)

An algorithm runs in cubic time if its runtime grows proportionally to the cube of the input size.

Example: Certain dynamic programming algorithms.
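
As a hedged illustration (the text cites dynamic programming; naive multiplication of two n×n matrices is a simpler classic O(n³) routine, chosen by us for clarity):

```python
def mat_mul(a, b):
    """Multiply two n x n matrices with three nested loops: O(n³) time."""
    n = len(a)
    c = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                c[i][j] += a[i][k] * b[k][j]
    return c

print(mat_mul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```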

Super-Polynomial Time - O(n^log n)

This growth rate sits between polynomial and exponential: faster than any polynomial but slower than exponential. Examples include some algorithms arising from combinatorics or recursion trees.

Exponential Time - O(2ⁿ)

An algorithm runs in exponential time if its runtime doubles with each additional input element. Example: enumerating every subset of an n-element set; brute-force search for problems like the traveling salesman problem grows at least this fast (the naive approach is in fact O(n!)).
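
A minimal sketch of an O(2ⁿ) pattern in Python (subset enumeration; names are ours): each additional element doubles the number of subsets that must be produced.

```python
def subsets(items):
    """Each element doubles the number of subsets, so this is O(2ⁿ)."""
    if not items:
        return [[]]
    rest = subsets(items[1:])
    return rest + [[items[0]] + s for s in rest]

print(len(subsets([1, 2, 3, 4])))  # 16 = 2^4
```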

Space Complexity

Constant Space - O(1)

An algorithm uses constant space if the amount of memory it requires does not change with the input size. Example: Swapping two variables.
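
A minimal sketch (Python; illustrative): the swap needs only a fixed number of temporaries regardless of the input values.

```python
def swap(a, b):
    """Uses a fixed amount of extra memory no matter the inputs: O(1) space."""
    a, b = b, a
    return a, b
```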

Linear Space - O(n)

An algorithm uses linear space if the amount of memory it requires grows linearly with the input size. Example: Creating a copy of an array.
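
A minimal sketch (Python): the copy allocates storage proportional to the input length.

```python
def copy_list(arr):
    """Allocates a new list of the same length as arr: O(n) extra space."""
    return list(arr)
```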

Quadratic Space - O(n²)

An algorithm uses quadratic space if the amount of memory it requires grows proportionally to the square of the input size. Example: Creating a 2D array.
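
A minimal sketch (Python; the function name is ours): an n×n table needs memory proportional to n².

```python
def make_grid(n):
    """Allocates an n x n table, so memory grows as O(n²)."""
    return [[0] * n for _ in range(n)]
```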

Logarithmic Space - O(log n)

An algorithm uses logarithmic space if the amount of memory it requires grows logarithmically with the input size. Example: Recursive algorithms that divide the problem in half at each step.
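
A minimal sketch (Python; illustrative): each call halves the problem, so the recursion stack is only O(log n) deep.

```python
def depth_of_halving(n):
    """Halves the problem each call; the recursion stack is O(log n) deep."""
    if n <= 1:
        return 0
    return 1 + depth_of_halving(n // 2)

print(depth_of_halving(16))  # 4
```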

Comparison

[Reference tables: searching algorithms, data structure operations, array sorting algorithms]
