Patterns & Tips
This lesson consolidates everything into copy-paste-ready templates, a decision framework for choosing the right backtracking variant, and answers to frequently asked questions.
Template 1: Permutations
Use when order matters and every element must be used exactly once.
```python
def permutation_template(nums):
    """Permutation template: use visited set, iterate from 0.
    - Result collected when path length equals input length
    - For unique permutations: sort + skip duplicates
    """
    result = []
    def backtrack(path, used):
        if len(path) == len(nums):
            result.append(path[:])
            return
        for i in range(len(nums)):
            if i in used:
                continue
            # For unique: if i > 0 and nums[i] == nums[i-1]
            #             and (i-1) not in used: continue
            used.add(i)
            path.append(nums[i])
            backtrack(path, used)
            path.pop()
            used.remove(i)
    backtrack([], set())
    return result
```
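The dedup lines in the template are easy to enable concretely. A minimal sketch of the unique-permutations variant (sort first, then skip a duplicate value whose equal predecessor is not currently in the path):

```python
def unique_permutations(nums):
    # Dedup variant of the permutation template: sort, then skip a duplicate
    # when its equal left neighbor has not been used at this level.
    nums = sorted(nums)
    result = []
    def backtrack(path, used):
        if len(path) == len(nums):
            result.append(path[:])
            return
        for i in range(len(nums)):
            if i in used:
                continue
            if i > 0 and nums[i] == nums[i - 1] and (i - 1) not in used:
                continue
            used.add(i)
            path.append(nums[i])
            backtrack(path, used)
            path.pop()
            used.remove(i)
    backtrack([], set())
    return result

print(unique_permutations([1, 1, 2]))
# [[1, 1, 2], [1, 2, 1], [2, 1, 1]]
```

Without the skip, `[1, 1, 2]` would produce six permutations, three of which are duplicates; with it, each distinct ordering appears exactly once.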
Template 2: Combinations
Use when order does not matter and you pick k elements from n.
```python
def combination_template(n, k):
    """Combination template: use start index, no revisiting.
    - Result collected when path length equals k
    - Upper bound pruning: i can go at most to n - (k - len(path)) + 1
    """
    result = []
    def backtrack(start, path):
        if len(path) == k:
            result.append(path[:])
            return
        for i in range(start, n - (k - len(path)) + 2):
            path.append(i)
            backtrack(i + 1, path)
            path.pop()
    backtrack(1, [])
    return result
```
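As a quick sanity check, the template can be run and compared against `itertools.combinations` (the import is only for verification, not part of the template):

```python
from itertools import combinations as it_combinations

def combination_template(n, k):
    # Combination template from above: pick k numbers from 1..n in
    # increasing order, with upper-bound pruning on the loop range.
    result = []
    def backtrack(start, path):
        if len(path) == k:
            result.append(path[:])
            return
        for i in range(start, n - (k - len(path)) + 2):
            path.append(i)
            backtrack(i + 1, path)
            path.pop()
    backtrack(1, [])
    return result

print(combination_template(4, 2))
# [[1, 2], [1, 3], [1, 4], [2, 3], [2, 4], [3, 4]]
assert combination_template(4, 2) == [list(c) for c in it_combinations(range(1, 5), 2)]
```

Note how the pruning already pays off at the top level: with n = 4, k = 2 the first element never exceeds 3, so the branch starting at 4 is never explored.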
Template 3: Subsets
Use when you need all possible subsets (power set).
```python
def subset_template(nums):
    """Subset template: collect at every node, not just leaves.
    - Result added at every recursion level
    - For unique: sort + skip duplicates at same level
    """
    result = []
    def backtrack(start, path):
        result.append(path[:])  # collect at every node
        for i in range(start, len(nums)):
            # For unique: if i > start and nums[i] == nums[i-1]: continue
            path.append(nums[i])
            backtrack(i + 1, path)
            path.pop()
    backtrack(0, [])
    return result
```
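Running the template on a three-element input shows the collect-at-every-node behavior, producing all 2^3 = 8 subsets:

```python
def subset_template(nums):
    # Subset template from above: record the path at every recursion node.
    result = []
    def backtrack(start, path):
        result.append(path[:])
        for i in range(start, len(nums)):
            path.append(nums[i])
            backtrack(i + 1, path)
            path.pop()
    backtrack(0, [])
    return result

print(subset_template([1, 2, 3]))
# [[], [1], [1, 2], [1, 2, 3], [1, 3], [2], [2, 3], [3]]
```

The empty subset comes from the very first node, before any element is chosen; there is no base case that returns early, because every node is a valid answer.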
Template 4: Grid/Matrix Backtracking
Use for path finding, word search, and constraint satisfaction on 2D grids.
```python
def grid_backtrack_template(grid, start_r, start_c):
    """Grid backtracking template: in-place marking.
    - Mark current cell visited (modify grid or use set)
    - Explore all valid neighbors
    - Unmark on backtrack
    Note: is_goal, record_solution, is_valid, updated_state, and
    initial_state are problem-specific placeholders to fill in.
    """
    rows, cols = len(grid), len(grid[0])
    def backtrack(r, c, state):
        # Base case: check if goal reached
        if is_goal(r, c, state):
            record_solution(state)
            return
        # Mark visited
        original = grid[r][c]
        grid[r][c] = '#'  # or add to visited set
        # Explore neighbors
        for dr, dc in [(0, 1), (0, -1), (1, 0), (-1, 0)]:
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols and
                    grid[nr][nc] != '#' and is_valid(nr, nc, state)):
                backtrack(nr, nc, updated_state)
        # Unmark (backtrack)
        grid[r][c] = original
    backtrack(start_r, start_c, initial_state)
```
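As a concrete instantiation, here is a sketch of Word Search (LC 79) built on this template: the state is the index of the next character to match, the goal is matching the whole word, and marking/unmarking happens in place on the board:

```python
def exist(board, word):
    # Word Search as an instance of the grid template: mark the cell in
    # place, try all four neighbors, restore the cell on backtrack.
    rows, cols = len(board), len(board[0])
    def backtrack(r, c, idx):
        if idx == len(word):          # goal: every character matched
            return True
        if not (0 <= r < rows and 0 <= c < cols) or board[r][c] != word[idx]:
            return False
        original = board[r][c]
        board[r][c] = '#'             # mark visited
        found = any(backtrack(r + dr, c + dc, idx + 1)
                    for dr, dc in [(0, 1), (0, -1), (1, 0), (-1, 0)])
        board[r][c] = original        # unmark (backtrack)
        return found
    return any(backtrack(r, c, 0) for r in range(rows) for c in range(cols))

board = [list("ABCE"), list("SFCS"), list("ADEE")]
print(exist(board, "ABCCED"))  # True
print(exist(board, "ABCB"))    # False
```

Because `any` short-circuits, the search stops as soon as one valid path is found, and the board is always restored to its original contents before returning.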
Template 5: String Partitioning
Use for palindrome partitioning, word break, IP address restoration.
```python
def string_partition_template(s):
    """String partition template: try all prefix lengths.
    - At each position, try segments of length 1, 2, 3, ...
    - Validate segment before recursing
    - Recurse on remaining string
    """
    result = []
    def backtrack(start, parts):
        if start == len(s):
            result.append(parts[:])
            return
        for end in range(start + 1, len(s) + 1):
            segment = s[start:end]
            if is_valid_segment(segment):  # problem-specific check
                parts.append(segment)
                backtrack(end, parts)
                parts.pop()
    backtrack(0, [])
    return result
```
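A concrete instantiation is Palindrome Partitioning (LC 131), where `is_valid_segment` becomes "the segment reads the same forwards and backwards":

```python
def partition(s):
    # Palindrome Partitioning as an instance of the string-partition
    # template: only recurse past a prefix that is a palindrome.
    result = []
    def backtrack(start, parts):
        if start == len(s):
            result.append(parts[:])
            return
        for end in range(start + 1, len(s) + 1):
            segment = s[start:end]
            if segment == segment[::-1]:  # palindrome check
                parts.append(segment)
                backtrack(end, parts)
                parts.pop()
    backtrack(0, [])
    return result

print(partition("aab"))
# [['a', 'a', 'b'], ['aa', 'b']]
```

The validity check doubles as pruning: an invalid prefix cuts off the entire subtree of partitions that would begin with it.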
Template 6: Constraint Satisfaction (CSP)
Use for Sudoku, N-Queens, and scheduling problems.
```python
def csp_template(variables, domains, constraints):
    """CSP template: assign values to variables under constraints.
    - Find next unassigned variable
    - Try each value in its domain
    - Check constraints before assigning
    - Backtrack if no valid value found
    """
    def backtrack(assignment):
        if len(assignment) == len(variables):
            return assignment.copy()  # solution found
        var = select_unassigned(variables, assignment)
        for value in domains[var]:
            if is_consistent(var, value, assignment, constraints):
                assignment[var] = value
                result = backtrack(assignment)
                if result is not None:
                    return result
                del assignment[var]
        return None  # no valid assignment
    return backtrack({})
```
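A minimal concrete instance of this template is graph (map) coloring. The helper names below are illustrative fill-ins for the template's placeholders: `select_unassigned` is "first variable not yet assigned", and the only constraint is "adjacent variables get different values":

```python
def solve_csp(variables, domains, neighbors):
    # Graph coloring as a CSP: assign each region a color such that no
    # two adjacent regions share a color.
    def is_consistent(var, value, assignment):
        return all(assignment.get(n) != value for n in neighbors[var])
    def backtrack(assignment):
        if len(assignment) == len(variables):
            return assignment.copy()          # solution found
        var = next(v for v in variables if v not in assignment)
        for value in domains[var]:
            if is_consistent(var, value, assignment):
                assignment[var] = value
                result = backtrack(assignment)
                if result is not None:
                    return result
                del assignment[var]           # undo and try next value
        return None                           # no valid assignment
    return backtrack({})

# Three mutually adjacent regions need three different colors.
variables = ["A", "B", "C"]
domains = {v: ["red", "green", "blue"] for v in variables}
neighbors = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B"]}
print(solve_csp(variables, domains, neighbors))
# {'A': 'red', 'B': 'green', 'C': 'blue'}
```

Unlike the enumeration templates, this one returns on the first solution found; returning `None` from a branch is what triggers backtracking one level up.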
Decision Framework: Which Template to Use?
Follow this decision tree when you see a new backtracking problem:
1. Does order matter?
   - Yes → Permutation template (visited set, iterate from 0)
   - No → go to step 2
2. Do you need all subsets or fixed-size selections?
   - All subsets → Subset template (collect at every node)
   - Fixed k elements → Combination template (collect at depth k)
   - Sum constraint → Combination Sum template (collect when sum = target)
3. Is it a 2D grid problem?
   - Path finding → Grid template with in-place marking
   - Placement (N-Queens) → CSP template with constraint sets
4. Is it a string partitioning problem?
   - Yes → String Partition template
5. Are there overlapping subproblems?
   - Yes → Add memoization to any template above
   - Need only a count or an optimal value → Convert to DP
6. Are there duplicates in the input?
   - Yes → Sort + skip duplicates at the same recursion level
Common Mistakes to Avoid
| Mistake | Fix |
|---|---|
| Forgetting to copy the path before appending to result | Always use result.append(path[:]) not result.append(path) |
| Not backtracking (forgetting path.pop()) | Every append must have a matching pop |
| Using continue instead of break for sorted pruning | break prunes the entire remaining subtree; continue only skips one element |
| Wrong start index for combinations vs permutations | Combinations: start = i + 1. Permutations: always start from 0 with visited set |
| Modifying the input without restoring it | Always restore the original value after backtracking (grid problems) |
| Stack overflow on deep recursion in Python | Use sys.setrecursionlimit() or convert to iterative with explicit stack |
| Not handling empty input | Add if not nums: return [[]] or return [] depending on the problem |
| Duplicate results with duplicate inputs | Sort the input, then skip if i > start and nums[i] == nums[i-1] |
Interview Execution Tips
1. Draw the Decision Tree
Before coding, draw the recursion tree for a small input (n=3). Show the interviewer the branching, base cases, and where pruning happens. This demonstrates clear thinking.
2. State the Template
Say: "This is a subset/permutation/CSP problem. I will use the [template name] with [specific modification]." This shows pattern recognition and structured problem solving.
3. Start Brute Force, Then Optimize
Write the basic backtracking first, then add pruning. Say: "This works but explores too many branches. Let me add [pruning technique] to reduce the search space."
4. Analyze Complexity Carefully
Backtracking complexity is tricky. State the worst case (exponential), then explain how pruning reduces it in practice. Mention the output size as a lower bound.
Frequently Asked Questions
When should I use backtracking, and when dynamic programming?
Use backtracking when you need to enumerate all solutions (all permutations, all valid configurations). The full path matters, and you cannot reduce the problem to subproblems.
Use dynamic programming when you need a count or optimal value, the problem has overlapping subproblems, and the answer depends on a compact state (not the full path).
Hybrid approach: Start with backtracking, then add memoization if you notice the same state being computed multiple times. This is top-down DP.
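A sketch of that hybrid on a small example, counting the ways to segment a string over a dictionary. The function name is illustrative; the point is that memoizing on the start index (the compact state) turns exponential backtracking into top-down DP:

```python
from functools import lru_cache

def count_segmentations(s, words):
    # Backtracking over split points, plus memoization on the start index.
    # Returns only a count, which is why the full path need not be stored.
    words = set(words)
    @lru_cache(maxsize=None)
    def count(start):
        if start == len(s):
            return 1
        return sum(count(end)
                   for end in range(start + 1, len(s) + 1)
                   if s[start:end] in words)
    return count(0)

print(count_segmentations("catsand", ["cat", "cats", "and", "sand"]))
# 2  ("cat"+"sand" and "cats"+"and")
```

Without the cache, `count(start)` for the same index would be recomputed once per distinct path reaching it; with it, each index is solved exactly once.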
How do I handle Python's recursion limit?
Python's default recursion limit is 1000 frames. For deep recursion:
- Increase the limit with sys.setrecursionlimit(10000) (risky for very deep recursion)
- Convert to iterative backtracking using an explicit stack: push (state, action) tuples and pop them in a while loop
- Use tail recursion elimination manually by converting the last recursive call to a loop
In interviews, mention the limit and say you would use sys.setrecursionlimit() or convert to iterative if needed. Most interview problems have recursion depth under 100.
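An iterative sketch of subset enumeration with an explicit stack, using no recursion at all. Each stack entry carries the same state the recursive version kept in its call frame:

```python
def subsets_iterative(nums):
    # Each frame is (start, path): the state a recursive call would hold.
    # Pushing a new tuple replaces the recursive call; popping replaces
    # the return. Results arrive in a different order than the recursive
    # version because the stack is LIFO.
    result = []
    stack = [(0, [])]
    while stack:
        start, path = stack.pop()
        result.append(path)
        for i in range(start, len(nums)):
            stack.append((i + 1, path + [nums[i]]))
    return result

print(sorted(subsets_iterative([1, 2, 3])))
# [[], [1], [1, 2], [1, 2, 3], [1, 3], [2], [2, 3], [3]]
```

Note there is no explicit "pop the path" step here: `path + [nums[i]]` builds a fresh list per frame, trading extra copying for not having to undo mutations.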
Why append path[:] instead of path?
In Python, lists are passed by reference. If you append path to result, you are appending a reference to the same list object. As backtracking modifies path (appending and popping), every reference stored in result will reflect those changes.
path[:] creates a shallow copy — a new list with the same elements at that moment in time. This snapshot is preserved even as the original path continues to be modified.
Alternatives: list(path), path.copy(), or copy.copy(path). All produce the same result. For strings, this is not an issue because strings are immutable.
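A three-line demonstration of the aliasing issue described above:

```python
path, result = [], []

path.append(1)
result.append(path)      # stores a reference to the same list object
result.append(path[:])   # stores a snapshot copy
path.pop()               # "backtrack"

print(result)
# [[], [1]]  -- the reference reflects the pop, the copy does not
```

The stored reference silently "lost" its element when the backtracking step popped it, while the snapshot kept the value it held at append time.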
How do I analyze backtracking complexity?
Time complexity of backtracking = number of nodes in the recursion tree × work per node.
- Permutations: n! leaves, O(n) work per leaf to copy → O(n! × n)
- Subsets: 2^n nodes, O(n) work per node to copy → O(2^n × n)
- Combinations C(n,k): C(n,k) leaves, O(k) work each → O(C(n,k) × k)
- With pruning: the theoretical bound does not change, but the constant factor drops dramatically. Mention both the worst case and the practical impact.
Lower bound: The output size is always a lower bound. If there are 2^n subsets, you cannot do better than O(2^n) because you must generate all of them.
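These counts are easy to sanity-check empirically with the standard library (bitmask enumeration stands in for the subset tree here):

```python
from itertools import permutations
from math import factorial

for n in range(1, 6):
    # n! orderings of n distinct items
    assert len(list(permutations(range(n)))) == factorial(n)
    # 2^n subsets, enumerated via bitmasks: bit i set means element i is in
    assert len([[i for i in range(n) if mask >> i & 1]
                for mask in range(1 << n)]) == 2 ** n
```

The gap between n! and 2^n also explains why permutation problems hit limits at much smaller n than subset problems do.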
Which backtracking problems are asked most often?
Based on frequency data from LeetCode and interview reports:
- Permutations (LC 46) — asked at Google, Meta, Amazon, Microsoft
- Subsets (LC 78) — asked at Meta, Google, Bloomberg
- Combination Sum (LC 39) — asked at Amazon, Google, Uber
- Generate Parentheses (LC 22) — asked at Google, Meta, Microsoft, Amazon
- Word Search (LC 79) — asked at Amazon, Microsoft, Meta
- N-Queens (LC 51) — asked at Google, Apple, Goldman Sachs
- Letter Combinations (LC 17) — asked at Amazon, Google, Meta
- Palindrome Partitioning (LC 131) — asked at Google, Amazon
Master these eight problems and you will cover the vast majority of backtracking questions in real interviews.
Backtracking in ML Systems
Backtracking and recursive search are foundational to many production ML systems:
- Hyperparameter search: Grid search explores a tree of hyperparameter combinations. Bayesian optimization (Optuna, Hyperopt) uses tree-structured search with early stopping as pruning.
- Neural Architecture Search: NAS explores architectures by choosing operations at each layer. ENAS and DARTS use weight-sharing and gradient-based pruning to avoid exhaustive backtracking.
- Constraint satisfaction: ML pipeline schedulers (Kubeflow, Airflow) use CSP solvers with backtracking to schedule tasks under resource constraints.
- Game AI: Minimax with alpha-beta pruning (chess engines) is backtracking on game trees. MCTS (AlphaGo) combines backtracking with random rollouts.
- Tokenization: BPE and WordPiece tokenizers segment text by exploring all valid segmentations — essentially backtracking over substring choices.
- Feature selection: Exhaustive feature subset selection is a subset enumeration problem. Practical methods (sequential, genetic) use heuristic pruning.
Complete Problem Index
| # | Problem | Pattern | Difficulty | Lesson |
|---|---|---|---|---|
| 1 | Permutations (LC 46) | Permutation | Medium | Lesson 2 |
| 2 | Permutations II (LC 47) | Permutation + Dedup | Medium | Lesson 2 |
| 3 | Combinations (LC 77) | Combination | Medium | Lesson 2 |
| 4 | Combination Sum (LC 39) | Combination + Reuse | Medium | Lesson 2 |
| 5 | Letter Case Permutation (LC 784) | Binary Branch | Easy | Lesson 2 |
| 6 | Subsets (LC 78) | Subset | Medium | Lesson 3 |
| 7 | Subsets II (LC 90) | Subset + Dedup | Medium | Lesson 3 |
| 8 | Combination Sum II (LC 40) | Subset + Sum + Dedup | Medium | Lesson 3 |
| 9 | Subset Sum (LC 494 variant) | Subset + Memo | Medium | Lesson 3 |
| 10 | Partition K Equal Subsets (LC 698) | Partition + Symmetry | Hard | Lesson 3 |
| 11 | Word Search (LC 79) | Grid Backtrack | Medium | Lesson 4 |
| 12 | N-Queens (LC 51) | CSP | Hard | Lesson 4 |
| 13 | Sudoku Solver (LC 37) | CSP | Hard | Lesson 4 |
| 14 | Rat in a Maze | Grid Path | Medium | Lesson 4 |
| 15 | Unique Paths with Obstacles (LC 63) | Grid + Memo | Medium | Lesson 4 |
| 16 | Palindrome Partitioning (LC 131) | String Partition | Medium | Lesson 5 |
| 17 | Generate Parentheses (LC 22) | Constrained Generation | Medium | Lesson 5 |
| 18 | Letter Combinations (LC 17) | Mapping + Backtrack | Medium | Lesson 5 |
| 19 | Restore IP Addresses (LC 93) | String Partition | Medium | Lesson 5 |
| 20 | Word Break II (LC 140) | Partition + Memo | Hard | Lesson 5 |
Lilly Tech Systems