The Stadium Counting Analogy
You need to count guests at a stadium with 100,000 people:
Naive approach: Count one by one. Takes forever!
Divide and conquer:
- Divide: Split stadium into 4 sections
- Conquer: Each section leader counts their section
- Combine: Add up the 4 numbers
Even better: Each section splits into subsections, and so on!
Divide and conquer breaks big problems into identical smaller problems, solves them independently, and combines the results.
The Three Steps
1. DIVIDE: Split the problem into smaller subproblems
2. CONQUER: Solve each subproblem recursively
3. COMBINE: Merge the solutions into final answer
The Base Case
Stop dividing when the problem is trivially small:
Counting people:
- Stadium of 1 person → just count: 1
- Stadium of 2 people → count each: 1 + 1 = 2
- Stadium of 100,000 → divide into sections first
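The counting recursion above can be sketched in code. Representing the stadium as a plain list and the name `count_people` are illustrative assumptions:

```python
def count_people(section):
    """Count people in a section by dividing it in half."""
    # Base case: a tiny section is counted directly
    if len(section) <= 2:
        return len(section)
    # Divide: split the section in half
    mid = len(section) // 2
    # Conquer each half, then Combine by adding the two counts
    return count_people(section[:mid]) + count_people(section[mid:])

stadium = ["person"] * 100_000
print(count_people(stadium))  # 100000
```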
Classic Example: Merge Sort
The Problem
Sort an unsorted array.
The Approach
Array: [38, 27, 43, 3, 9, 82, 10]
DIVIDE: Split in half
[38, 27, 43, 3] [9, 82, 10]
KEEP DIVIDING:
[38, 27] [43, 3] [9, 82] [10]
[38][27] [43][3] [9][82] [10]
BASE CASE: Arrays of size 1 are sorted!
COMBINE (merge sorted halves):
[27,38] [3,43] [9,82] [10]
[3,27,38,43] [9,10,82]
[3,9,10,27,38,43,82]
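The walkthrough above can be sketched as a minimal merge sort (function names are illustrative, and the list-slicing style favors clarity over in-place efficiency):

```python
def merge_sort(arr):
    """Sort a list by splitting, recursing, and merging."""
    # Base case: arrays of size 0 or 1 are already sorted
    if len(arr) <= 1:
        return arr
    # Divide: split in half
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Combine: merge the two sorted halves
    return merge(left, right)

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # One side is exhausted; append whatever remains of the other
    result.extend(left[i:])
    result.extend(right[j:])
    return result

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))
# [3, 9, 10, 27, 38, 43, 82]
```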
Why It Works
- Each split takes O(1)
- We split log n times (halving each time)
- Each merge level takes O(n) total
- Total: O(n log n)
Classic Example: Binary Search
The Problem
Find a target in a sorted array.
The Approach
Array: [2, 5, 8, 12, 16, 23, 38, 56, 72, 91]
Target: 23
DIVIDE: Check middle element (16)
23 > 16 → search the right half
[23, 38, 56, 72, 91]
CHECK MIDDLE (56):
23 < 56 → search left half
[23, 38]
CHECK MIDDLE (23):
23 == 23 → Found!
Each step eliminates half the remaining elements → O(log n)
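The halving above is usually written as a loop rather than explicit recursion; a minimal sketch, assuming the input array is already sorted:

```python
def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1  # target is larger: discard the left half
        else:
            hi = mid - 1  # target is smaller: discard the right half
    return -1

print(binary_search([2, 5, 8, 12, 16, 23, 38, 56, 72, 91], 23))  # 5
```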
The Master Theorem (Simplified)
For divide and conquer recurrences:
T(n) = a × T(n/b) + O(n^d)
Where:
a = number of subproblems
b = factor by which the input size shrinks
d = exponent of the divide/combine cost, O(n^d)
The result comes from comparing d with log_b a:
- d > log_b a → T(n) = O(n^d)
- d = log_b a → T(n) = O(n^d log n)
- d < log_b a → T(n) = O(n^(log_b a))
Common Patterns
| Algorithm | Recurrence | Result |
|---|---|---|
| Binary Search | T(n) = T(n/2) + O(1) | O(log n) |
| Merge Sort | T(n) = 2T(n/2) + O(n) | O(n log n) |
| Matrix Multiply | T(n) = 8T(n/2) + O(n²) | O(n³) |
| Strassen's | T(n) = 7T(n/2) + O(n²) | O(n^(log₂ 7)) ≈ O(n^2.81) |
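The table rows can be checked mechanically. `master` here is a hypothetical helper that compares d against the critical exponent log_b(a):

```python
import math

def master(a, b, d):
    """Asymptotic class of T(n) = a*T(n/b) + O(n^d) (simplified cases)."""
    crit = math.log(a, b)  # critical exponent log_b(a)
    if math.isclose(d, crit):
        if d == 0:
            return "O(log n)"
        if d == 1:
            return "O(n log n)"
        return f"O(n^{d} log n)"
    if d > crit:
        return f"O(n^{d})"
    return f"O(n^{crit:.2f})"

print(master(1, 2, 0))  # Binary search   -> O(log n)
print(master(2, 2, 1))  # Merge sort      -> O(n log n)
print(master(8, 2, 2))  # Matrix multiply -> O(n^3.00)
print(master(7, 2, 2))  # Strassen        -> O(n^2.81)
```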
Common Divide and Conquer Problems
1. Maximum Subarray
Find contiguous subarray with largest sum.
Divide: Split array in half
Conquer: Find max in left half, right half
Combine: Also check max crossing the middle
O(n log n) divide-and-conquer (though O(n) Kadane's is better)
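The divide/conquer/combine steps above can be sketched directly; a minimal O(n log n) version with illustrative names (Kadane's O(n) scan, mentioned above, would be the simpler choice in practice):

```python
def max_subarray(arr, lo=0, hi=None):
    """Max sum of a contiguous subarray, divide-and-conquer style."""
    if hi is None:
        hi = len(arr) - 1
    # Base case: a single element
    if lo == hi:
        return arr[lo]
    mid = (lo + hi) // 2
    # Conquer: best sum entirely within the left or right half
    best_left = max_subarray(arr, lo, mid)
    best_right = max_subarray(arr, mid + 1, hi)
    # Combine: best sum crossing the middle
    left_sum, total = float("-inf"), 0
    for i in range(mid, lo - 1, -1):
        total += arr[i]
        left_sum = max(left_sum, total)
    right_sum, total = float("-inf"), 0
    for i in range(mid + 1, hi + 1):
        total += arr[i]
        right_sum = max(right_sum, total)
    return max(best_left, best_right, left_sum + right_sum)

print(max_subarray([-2, 1, -3, 4, -1, 2, 1, -5, 4]))  # 6
```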
2. Count Inversions
Count pairs (i,j) where i < j but arr[i] > arr[j].
Modify merge sort:
During merge, count how many elements from right half
are smaller than elements from left half.
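A sketch of that modification: the merge is the same as merge sort's, plus one counting line (the function returns both the sorted list and the count, an illustrative design choice):

```python
def count_inversions(arr):
    """Return (sorted_arr, inversion_count) via modified merge sort."""
    if len(arr) <= 1:
        return arr, 0
    mid = len(arr) // 2
    left, inv_left = count_inversions(arr[:mid])
    right, inv_right = count_inversions(arr[mid:])
    merged, i, j = [], 0, 0
    inv = inv_left + inv_right
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            # right[j] jumps ahead of every remaining left element:
            # each of those pairs is one inversion
            inv += len(left) - i
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, inv

print(count_inversions([2, 4, 1, 3, 5])[1])  # 3
```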
3. Closest Pair of Points
Find two closest points in 2D plane.
Divide: Split points by x-coordinate
Conquer: Find closest pair in each half
Combine: Check pairs crossing the dividing line
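A sketch of the strip-check version, assuming points are (x, y) tuples (`math.dist` requires Python 3.8+; the `solve` helper name is illustrative):

```python
import math

def closest_pair(points):
    """Smallest distance between any two points in the plane."""
    pts = sorted(points)  # sort by x-coordinate once

    def solve(pts):
        n = len(pts)
        # Base case: few enough points to brute-force
        if n <= 3:
            return min(math.dist(p, q)
                       for i, p in enumerate(pts)
                       for q in pts[i + 1:])
        # Divide: split by x-coordinate
        mid = n // 2
        mid_x = pts[mid][0]
        # Conquer: best distance within each half
        best = min(solve(pts[:mid]), solve(pts[mid:]))
        # Combine: check pairs straddling the dividing line
        strip = sorted((p for p in pts if abs(p[0] - mid_x) < best),
                       key=lambda p: p[1])
        for i, p in enumerate(strip):
            for q in strip[i + 1:]:
                if q[1] - p[1] >= best:
                    break  # further strip points are too far vertically
                best = min(best, math.dist(p, q))
        return best

    return solve(pts)

print(closest_pair([(0, 0), (3, 4), (1, 1), (7, 7)]))
```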
4. Karatsuba Multiplication
Multiply two n-digit numbers with fewer than the n² single-digit multiplications that schoolbook long multiplication requires.
Normal: 1234 × 5678 requires n² single-digit multiplications
Karatsuba: 3 recursive multiplications of n/2 digits
Result: O(n^(log₂ 3)) instead of O(n²)
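A minimal Karatsuba sketch on Python integers, splitting by decimal digits for readability (the key trick: z1 is recovered from one product of sums, so only three recursive multiplications are needed):

```python
def karatsuba(x, y):
    """Multiply non-negative integers with 3 recursive products."""
    # Base case: single-digit factors are multiplied directly
    if x < 10 or y < 10:
        return x * y
    # Divide: split each number into high and low halves of m digits
    m = max(len(str(x)), len(str(y))) // 2
    high_x, low_x = divmod(x, 10 ** m)
    high_y, low_y = divmod(y, 10 ** m)
    # Conquer: three multiplications instead of four
    z0 = karatsuba(low_x, low_y)
    z2 = karatsuba(high_x, high_y)
    z1 = karatsuba(low_x + high_x, low_y + high_y) - z0 - z2
    # Combine: reassemble with shifts (multiplying by powers of 10)
    return z2 * 10 ** (2 * m) + z1 * 10 ** m + z0

print(karatsuba(1234, 5678))  # 7006652
```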
Divide and Conquer vs Dynamic Programming
| Divide and Conquer | Dynamic Programming |
|---|---|
| Subproblems are independent | Subproblems overlap |
| No reuse of solutions | Solutions are cached |
| Typically top-down | Top-down or bottom-up |
| Merge sort, quick sort | Fibonacci, knapsack |
Key Difference
Divide and Conquer: Each subproblem is solved once
Merge left half, merge right half, combine.
Left and right are DIFFERENT parts.
Dynamic Programming: Same subproblems recur
fib(5) needs fib(4) needs fib(3)
fib(5) also needs fib(3)
Same subproblem appears multiple times!
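The repetition can be made visible by instrumenting naive Fibonacci with a call counter (`calls` is illustrative scaffolding, not part of a real implementation):

```python
from collections import Counter

calls = Counter()

def fib(n):
    """Naive recursive Fibonacci, instrumented to count calls."""
    calls[n] += 1
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

fib(5)
print(calls[3])  # fib(3) is computed 2 times; memoization would make it 1
```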
When to Use Divide and Conquer
Good Candidates
- Problem can be split into identical smaller problems
- Subproblems are independent (no overlap)
- Results can be combined efficiently
Common Signals
✓ "Split in half"
✓ "Combine sorted results"
✓ "Reduce search space"
✓ "Recursive structure"
When NOT to Use
- Overlapping subproblems → use DP
- No clear division point
- Combining is as hard as original problem
Common Mistakes
1. Forgetting the Base Case
Every recursion needs a stopping condition!
2. Inefficient Combine Step
If the combine step is O(n²), the whole algorithm is at least O(n²) no matter how cleverly you divide.
3. Unequal Splits
Try to divide evenly. Lopsided splits can increase complexity.
4. Not Recognizing Overlapping Subproblems
If you're solving the same subproblem multiple times, add memoization or switch to DP.
FAQ
Q: Is binary search divide and conquer?
Yes! It divides the problem in half and solves one subproblem (conquer), with trivial combining.
Q: What's the typical time complexity?
Often O(n log n) - dividing log n times and doing O(n) work at each level.
Q: Quick sort vs merge sort?
Both are divide-and-conquer. Quick sort divides first (partition), merge sort combines later (merge).
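For contrast with merge sort, a minimal quick sort sketch: all the real work happens in the partition (divide) step, and combining is just concatenation (the list-comprehension partition favors clarity over the usual in-place version):

```python
def quick_sort(arr):
    """Sort by partitioning around a pivot, then recursing."""
    if len(arr) <= 1:
        return arr  # base case
    pivot = arr[len(arr) // 2]
    # Divide: partition into less-than, equal, and greater-than
    less = [x for x in arr if x < pivot]
    equal = [x for x in arr if x == pivot]
    greater = [x for x in arr if x > pivot]
    # Conquer + Combine: concatenation is trivial, no merge needed
    return quick_sort(less) + equal + quick_sort(greater)

print(quick_sort([38, 27, 43, 3, 9, 82, 10]))
# [3, 9, 10, 27, 38, 43, 82]
```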
Q: Can I usually use divide and conquer?
It depends. The problem needs a natural way to split AND combine, and some problems don’t.
Summary
Divide and conquer solves problems by breaking them into smaller identical problems, solving each, and combining results.
Key Takeaways:
- Three steps: divide, conquer, combine
- Subproblems should be independent
- Often gives O(n log n) complexity
- Classic examples: merge sort, binary search, quick sort
- Different from DP: no overlapping subproblems
- Base case is essential to stop recursion
Master divide and conquer for efficient solutions to recursive problems!