
📏 Big O

Worst-case traffic time estimate

The Traffic Time Analogy

Estimating travel time:

  • O(1): "A fixed short time, no matter where you're going" - Teleportation!
  • O(log n): "A few more minutes per 10x distance" - Highway with no traffic
  • O(n): "Time proportional to distance" - Normal driving
  • O(n²): "Checking every route against every other" - Traffic jam maze

Big O notation describes how algorithms scale. It's not about exact time - it's about the pattern of growth.


What Big O Actually Means

O(n) doesn't mean "takes n milliseconds"
O(n) means "grows proportionally to n"

If n doubles, the time roughly doubles.
If n grows 10x, the time grows 10x.

We care about behavior at scale, not exact values.


The Common Big O Classes

O(1) - Constant

Same time regardless of input size:

Examples:
- Array access: arr[5]
- Hash table lookup: dict["key"]
- Push/pop from stack

n = 10 → 1 operation
n = 1,000,000 → still 1 operation
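As a minimal Python sketch (the variable names here are illustrative): each of these operations touches a fixed amount of data, no matter how large the container is.

```python
# O(1) operations: cost does not depend on container size.
arr = list(range(1_000_000))
prices = {"apple": 3, "pear": 2}
stack = []

x = arr[5]            # array (list) indexing: one memory access
p = prices["apple"]   # hash table lookup: O(1) on average
stack.append(42)      # push onto a stack
top = stack.pop()     # pop from a stack
```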

O(log n) - Logarithmic

Grows very slowly, halves the problem each step:

Examples:
- Binary search
- Balanced tree operations
- Finding in sorted data

n = 10 → ~3 operations
n = 1,000,000 → ~20 operations
n = 1,000,000,000 → ~30 operations

Mind-blowing: 1 billion items in 30 steps!
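Binary search is the classic O(log n) algorithm. A minimal sketch (the function name is illustrative): each comparison discards half of the remaining range.

```python
def binary_search(sorted_arr, target):
    """Return the index of target in sorted_arr, or -1 if absent.
    O(log n): the search range halves on every iteration."""
    lo, hi = 0, len(sorted_arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_arr[mid] == target:
            return mid
        elif sorted_arr[mid] < target:
            lo = mid + 1   # target must be in the right half
        else:
            hi = mid - 1   # target must be in the left half
    return -1
```

Searching a sorted list of a million items takes at most about 20 comparisons.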

O(n) - Linear

Directly proportional to input:

Examples:
- Find max in unsorted array
- Single loop through data
- Print all elements

n = 10 → 10 operations
n = 1,000,000 → 1,000,000 operations
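Finding the max is a typical O(n) task. A minimal sketch: one pass, each element examined exactly once.

```python
def find_max(arr):
    """O(n): a single pass through the data, one comparison per element."""
    best = arr[0]
    for item in arr[1:]:
        if item > best:
            best = item
    return best
```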

O(n log n) - Log-Linear

Slightly more than linear, typical for efficient sorting:

Examples:
- Merge sort
- Quick sort (average)
- Heap sort

n = 10 → ~33 operations
n = 1,000,000 → ~20,000,000 operations
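Merge sort shows where the n log n comes from: roughly log n levels of splitting, with O(n) merge work at each level. A minimal sketch, not a production sort:

```python
def merge_sort(arr):
    """O(n log n): ~log n levels of halving, O(n) merging per level."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves in O(n).
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```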

O(n²) - Quadratic

Square of input, nested loops:

Examples:
- Bubble sort
- Check all pairs
- Simple matrix operations

n = 10 → 100 operations
n = 1,000 → 1,000,000 operations
n = 1,000,000 → 1,000,000,000,000 operations 💀
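"Check all pairs" is the canonical O(n²) shape: a loop nested inside a loop. A minimal sketch (function name illustrative):

```python
def has_duplicate_pair(arr):
    """O(n²): the nested loops compare every pair of elements."""
    n = len(arr)
    for i in range(n):
        for j in range(i + 1, n):
            if arr[i] == arr[j]:
                return True
    return False
```

(For comparison, a set-based version would do this in O(n) average time.)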

O(2^n) - Exponential

Doubles with each input element:

Examples:
- Generate all subsets
- Naive recursive Fibonacci
- Brute-force passwords

n = 10 → 1,024 operations
n = 20 → ~1,000,000 operations
n = 40 → ~1,000,000,000,000 operations 💀
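Generating all subsets shows why it's O(2^n): every element is either in or out of each subset, so adding one element doubles the count. A minimal sketch:

```python
def all_subsets(items):
    """O(2^n): each element doubles the number of subsets built so far."""
    subsets = [[]]
    for item in items:
        # For every existing subset, add a copy that includes this item.
        subsets += [s + [item] for s in subsets]
    return subsets
```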

O(n!) - Factorial

Grows insanely fast:

Examples:
- Generate all permutations
- Traveling salesman (brute force)

n = 10 → 3,628,800 operations
n = 20 → ~2.4 × 10^18 operations 💀
n = 100 → ~10^157 operations - more than atoms in the observable universe (~10^80)!
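Generating all permutations shows where n! comes from: n choices for the first slot, n−1 for the second, and so on. A minimal recursive sketch (function name illustrative; Python's `itertools.permutations` does the same job):

```python
def all_orderings(items):
    """O(n!): n choices for the first position, n-1 for the next, ..."""
    if len(items) <= 1:
        return [list(items)]
    result = []
    for i, first in enumerate(items):
        rest = items[:i] + items[i + 1:]
        for perm in all_orderings(rest):
            result.append([first] + perm)
    return result
```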

The Growth Comparison

Input:      10       100      1,000      10,000
-----------------------------------------------------
O(1)        1        1        1          1
O(log n)    3        7        10         13
O(n)        10       100      1,000      10,000
O(n log n)  33       700      10,000     130,000
O(n²)       100      10,000   1,000,000  100,000,000

See the pattern? Higher complexities explode!

Rules for Calculating Big O

Rule 1: Drop Constants

O(2n) → O(n)
O(500) → O(1)
O(n/2) → O(n)

Constants don't affect the growth pattern.

Rule 2: Drop Lower-Order Terms

O(n² + n) → O(n²)
O(n + log n) → O(n)
O(n³ + n² + n) → O(n³)

The dominant term wins at scale.

Rule 3: Add for Sequential Operations

Do A, then B:
O(A) + O(B)

Loop n times, then loop m times:
O(n) + O(m) = O(n + m)

Rule 4: Multiply for Nested Operations

For each A, do B:
O(A) × O(B)

Nested loops over n:
O(n) × O(n) = O(n²)
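Rules 3 and 4 can be made concrete by counting operations directly. A minimal sketch (function names illustrative):

```python
def sequential(n, m):
    """Rule 3: two loops one after the other → O(n + m)."""
    ops = 0
    for _ in range(n):   # first loop: n operations
        ops += 1
    for _ in range(m):   # then a second loop: m operations
        ops += 1
    return ops           # exactly n + m

def nested(n):
    """Rule 4: a loop inside a loop → O(n × n) = O(n²)."""
    ops = 0
    for _ in range(n):       # outer loop runs n times...
        for _ in range(n):   # ...and the inner loop runs n times per pass
            ops += 1
    return ops               # exactly n * n
```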

Quick Recognition Patterns

Pattern                      Complexity
--------------------------------------------
Direct lookup, no loops      O(1)
Halving each step            O(log n)
Single pass through data     O(n)
Divide and process           O(n log n)
Nested loops (n × n)         O(n²)
All subsets                  O(2^n)
All orderings                O(n!)

Space Complexity

Big O also applies to memory:

O(1) space: Fixed variables
     int total = 0;

O(n) space: Array sized by input
     new_array = make array of size n

O(n²) space: 2D matrix
     matrix = n × n grid

Time-Space Tradeoffs

You often trade one for the other:

Example: Fibonacci

Time O(2^n), Space O(n): Naive recursion (the call stack grows n deep)
Time O(n), Space O(n): Memoization (cache every subproblem)
Time O(n), Space O(1): Iterative with two variables
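A minimal Python sketch of all three Fibonacci variants side by side:

```python
def fib_naive(n):
    """Time O(2^n), space O(n) for the call stack: recomputes subproblems."""
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

def fib_memo(n, cache=None):
    """Time O(n), space O(n): each subproblem is computed once, then cached."""
    if cache is None:
        cache = {}
    if n < 2:
        return n
    if n not in cache:
        cache[n] = fib_memo(n - 1, cache) + fib_memo(n - 2, cache)
    return cache[n]

def fib_iter(n):
    """Time O(n), space O(1): only the last two values are kept."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```

Try `fib_naive(40)` and `fib_iter(40)` to feel the difference: the first takes seconds, the second is instant.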

Common Mistakes

1. Forgetting Hidden Complexity

for item in arr:              # O(n) iterations
    if item in other_list:    # O(n) - membership test on a list!

Total: O(n²), not O(n)!

Fix: Use a set for O(1) lookup
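A minimal sketch of the fix (function names illustrative): pay an O(n) one-time cost to build a set, then every lookup is O(1) on average.

```python
def common_slow(arr, others):
    """O(n²): 'in' on a list scans the whole list each time."""
    return [item for item in arr if item in others]

def common_fast(arr, others):
    """O(n): build a set once, then each membership test is O(1) on average."""
    lookup = set(others)   # O(n) one-time cost
    return [item for item in arr if item in lookup]
```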

2. Assuming O(1) Hash Operations

Hash tables are O(1) average, but O(n) worst case with bad hashing.

3. Thinking Big O Tells Exact Time

Big O describes growth, not absolute time.

FAQ

Q: Why do we care about worst case?

In production, you need to plan for the worst. A single bad input can crash your system.

Q: What about average and worst case in practice?

Important for context! Quick sort is O(n²) worst but O(n log n) average, so it's still used.

Q: Is O(n log n) closer to O(n) or O(n²)?

Much closer to O(n)! The log n factor grows so slowly that O(n log n) behaves almost linearly in practice.

Q: How do I improve my algorithm's Big O?

Better data structures, smarter approaches (divide-and-conquer), or trade space for time.


Summary

Big O notation describes how algorithms scale as input grows.

Key Takeaways:

  • O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2^n) < O(n!)
  • Drop constants and lower-order terms
  • Add for sequential, multiply for nested
  • Focus on the dominant operation
  • The difference becomes massive at scale
  • Also applies to space (memory usage)

Big O is the language of algorithm efficiency - master it!
