Dynamic Programming (DP) is a technique for solving problems by breaking them down into simpler overlapping subproblems and storing their solutions to avoid redundant calculations. DP is especially useful for optimization problems where we aim to minimize or maximize a certain quantity, and it can significantly improve the efficiency of algorithms that have overlapping subproblems.
There are two primary approaches in DP:
1. Memoization (Top-Down Approach)
2. Tabulation (Bottom-Up Approach)
Additionally, there are common patterns in DP that are widely applicable across different problems.
Memoization (Top-Down Approach)
Memoization is a top-down approach where we solve the problem recursively and store the results of subproblems in a table (typically a dictionary or array) to avoid redundant calculations. Each subproblem is only solved once, and its result is stored for future reference.
Example: Fibonacci Sequence with Memoization
The Fibonacci sequence is a classic DP problem where each number is the sum of the two preceding ones.
def fibonacci_memo(n: int, memo={}) -> int:
    # Base cases
    if n <= 1:
        return n
    # Check if the result is already computed
    if n in memo:
        return memo[n]
    # Recursive case with memoization
    memo[n] = fibonacci_memo(n - 1, memo) + fibonacci_memo(n - 2, memo)
    return memo[n]
# Example usage
print(fibonacci_memo(10)) # Output: 55
Explanation
- Base Cases: Return n if n is 0 or 1.
- Memo Check: If n is in the memo dictionary, return the stored result.
- Store Result: Compute the result, store it in the memo, and return it.
Benefits of Memoization
- Avoids recalculating results for the same subproblem, reducing time complexity.
- Useful for problems with complex recursion trees where subproblems overlap significantly.
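For comparison, Python's built-in functools.lru_cache decorator applies the same idea without manual bookkeeping; a minimal sketch (the function name fibonacci_cached is illustrative):
from functools import lru_cache

@lru_cache(maxsize=None)  # caches results of previous calls automatically
def fibonacci_cached(n: int) -> int:
    if n <= 1:
        return n
    return fibonacci_cached(n - 1) + fibonacci_cached(n - 2)

print(fibonacci_cached(10))  # Output: 55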
Tabulation (Bottom-Up Approach)
Tabulation is a bottom-up approach where we solve smaller subproblems first, store their results in a table, and then build up to the desired solution. Tabulation generally avoids the overhead of recursion and can sometimes be faster and more space-efficient than memoization.
Example: Fibonacci Sequence with Tabulation
def fibonacci_tab(n: int) -> int:
    # Edge cases
    if n <= 1:
        return n
    # Initialize table to store Fibonacci numbers
    table = [0] * (n + 1)
    table[1] = 1
    # Fill the table iteratively
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]
# Example usage
print(fibonacci_tab(10)) # Output: 55
Explanation
- Initialize Base Values: table[0] is 0 and table[1] is 1.
- Fill Table: Each value is calculated based on previous values in the table.
- Return Final Value: table[n] contains the result.
Benefits of Tabulation
- Avoids recursion overhead, making it faster in practice for some problems.
- Ensures all subproblems are calculated in a specific order, which can be beneficial for certain DP problems.
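Because each table entry depends only on the two preceding entries, the full table can be replaced by two rolling variables. A minimal sketch of this space optimization (the function name fibonacci_tab_optimized is illustrative):
def fibonacci_tab_optimized(n: int) -> int:
    if n <= 1:
        return n
    # Keep only the last two computed values, reducing space from O(n) to O(1)
    prev, curr = 0, 1
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr

print(fibonacci_tab_optimized(10))  # Output: 55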
Common DP Patterns
Dynamic programming problems often fall into recognizable patterns. Recognizing these patterns can help in designing the solution more efficiently. Here are some common DP patterns:
Pattern 1: Fibonacci Sequence and DP on Linear Structures
This pattern covers problems where the solution at each position in a linear sequence is a sum or combination of the solutions at the immediately preceding positions.
- Examples: Fibonacci sequence, climbing stairs problem, integer break.
- Approach: Use a 1D array (or two variables if only two previous values are required) to store the results of previous computations.
def climb_stairs(n: int) -> int:
    if n <= 1:
        return 1
    # Use two variables instead of an array for O(1) space complexity
    a, b = 1, 2
    for _ in range(2, n):
        a, b = b, a + b
    return b
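A quick usage check, assuming steps of 1 or 2 stairs at a time:
# Example usage
print(climb_stairs(5))  # Output: 8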
Pattern 2: Knapsack Pattern (Subset Sum)
In knapsack-type problems, the goal is to decide whether to include or exclude each item while optimizing a certain metric (e.g., maximizing total value without exceeding a weight capacity).
- Examples: 0/1 knapsack, subset sum, partition equal subset sum.
- Approach: Use a 2D DP table where dp[i][j] represents the maximum achievable value using the first i items with a capacity of j.
from typing import List

def knapsack(values: List[int], weights: List[int], capacity: int) -> int:
    n = len(values)
    # dp[i][w]: best value achievable using the first i items with capacity w
    dp = [[0] * (capacity + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for w in range(capacity + 1):
            if weights[i - 1] <= w:
                # Either skip item i, or take it and add its value
                dp[i][w] = max(dp[i - 1][w], values[i - 1] + dp[i - 1][w - weights[i - 1]])
            else:
                dp[i][w] = dp[i - 1][w]
    return dp[n][capacity]
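A quick usage example with a standard 0/1 knapsack instance:
# Example usage: items worth 60, 100, 120 with weights 10, 20, 30 and capacity 50
print(knapsack([60, 100, 120], [10, 20, 30], 50))  # Output: 220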
Pattern 3: Longest Common Subsequence (LCS)
The LCS pattern involves finding the longest sequence that is common between two sequences.
- Examples: Longest common subsequence, edit distance, longest palindromic subsequence.
- Approach: Use a 2D table where dp[i][j] represents the solution for the prefixes of length i and j of the two sequences.
def longest_common_subsequence(s1: str, s2: str) -> int:
    m, n = len(s1), len(s2)
    # dp[i][j]: length of the LCS of s1[:i] and s2[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if s1[i - 1] == s2[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]
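A quick usage example:
# Example usage
print(longest_common_subsequence("abcde", "ace"))  # Output: 3 ("ace")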
Pattern 4: Optimal Substructure Problems
Problems where each subproblem’s solution contributes to solving larger problems have optimal substructure; interval problems such as matrix chain multiplication and rod cutting are classic examples.
- Examples: Matrix chain multiplication, burst balloons.
- Approach: Define dp[i][j] to represent the optimal solution for the subarray (or interval) from i to j, building up from shorter intervals to longer ones.
from typing import List

def matrix_chain_multiplication(dims: List[int]) -> int:
    # The i-th matrix has dimensions dims[i] x dims[i + 1]
    n = len(dims) - 1
    dp = [[0] * n for _ in range(n)]
    for l in range(2, n + 1):  # l is the chain length
        for i in range(n - l + 1):
            j = i + l - 1
            dp[i][j] = float('inf')
            for k in range(i, j):  # try every split point k
                cost = dp[i][k] + dp[k + 1][j] + dims[i] * dims[k + 1] * dims[j + 1]
                dp[i][j] = min(dp[i][j], cost)
    return dp[0][n - 1]
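A quick usage example with three matrices:
# Example usage: matrices of sizes 10x20, 20x30, and 30x40
print(matrix_chain_multiplication([10, 20, 30, 40]))  # Output: 18000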
When to Use Dynamic Programming
- Overlapping Subproblems: If a problem can be divided into subproblems that are solved multiple times, DP can avoid redundant calculations.
- Optimal Substructure: If the solution to a problem can be constructed from the solutions to its subproblems, DP is often applicable.
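To see why overlapping subproblems matter, one rough sketch is to count how many recursive calls a naive Fibonacci makes compared with the memoized version above (the helper fib_naive and its counter are illustrative):
def fib_naive(n: int, counter: dict) -> int:
    # Count every call to show how often the same subproblems are recomputed
    counter['calls'] = counter.get('calls', 0) + 1
    if n <= 1:
        return n
    return fib_naive(n - 1, counter) + fib_naive(n - 2, counter)

counter = {}
fib_naive(20, counter)
print(counter['calls'])  # 21891 calls, versus a linear number of calls when memoized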