Fundamentals

Big O notation is a mathematical concept used to describe an algorithm’s efficiency in terms of time complexity (how the runtime grows with input size) and space complexity (how the memory usage grows with input size). Knowing Big O can help identify performance bottlenecks, optimize algorithms, and predict scalability.

Time Complexity

Time complexity measures the number of operations an algorithm performs relative to the input size, denoted as n. It’s not exact but provides an upper bound for the algorithm’s growth rate, which helps with worst-case planning.

Common Time Complexities

  • O(1) - Constant time: The runtime doesn’t grow with input size.
  • O(log n) - Logarithmic time: The runtime grows with the logarithm of the input size, halving the remaining work at each step (e.g., binary search).
  • O(n) - Linear time: Iterating through an array.
  • O(n log n) - Linearithmic time: Often seen in efficient sorting algorithms (like merge sort).
  • O(n^2) - Quadratic time: Nested loops over the input (e.g., bubble sort).
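To make the logarithmic case concrete, here is a minimal binary search sketch in the style of the examples below; each iteration halves the remaining search range, giving O(log n) time.

```python
def binary_search(arr: list[int], target: int) -> int:
    """
    Returns the index of target in a sorted arr, otherwise -1.
    Time Complexity: O(log n), since each iteration halves the search range.
    """
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1  # discard the left half
        else:
            hi = mid - 1  # discard the right half
    return -1
```

Note that binary search requires the input to be sorted; on unsorted data you are back to the O(n) linear scan.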

Example: Time Complexity Analysis

def find_target(arr: list[int], target: int) -> int:
    """
    Returns the index of the target if found in arr, otherwise -1.
    Time Complexity: O(n), as we need to iterate through each element in the worst case.
    """
    for i, num in enumerate(arr):
        if num == target:
            return i
    return -1
  • Best-case (O(1)): If the target is the first element.
  • Worst-case (O(n)): If the target is the last element or not present.

A common mistake is assuming the best or average case applies universally. Interviewers expect a worst-case analysis to ensure a robust understanding of Big O.

Space Complexity

Space complexity measures the amount of memory required by the algorithm as the input size grows. It considers variables, data structures, and recursive calls.

Example: Space Complexity Analysis

def sum_elements(arr: list[int]) -> int:
    """
    Returns the sum of elements in the array.
    Time Complexity: O(n)
    Space Complexity: O(1), as only one integer (total) is used.
    """
    total = 0
    for num in arr:
        total += num
    return total
  • Here, space complexity is O(1) because total is a single variable whose size doesn’t grow with the input.
  • Gotcha: Recursive algorithms often have extra space complexity due to the call stack.
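The call-stack gotcha can be illustrated with a recursive version of the same sum: it computes the identical result, but each element adds a frame to the call stack, so the space cost becomes O(n). This is a sketch for illustration, not a recommended way to sum a list.

```python
def sum_elements_recursive(arr: list[int], i: int = 0) -> int:
    """
    Returns the sum of elements in the array, recursively.
    Time Complexity: O(n)
    Space Complexity: O(n), because the recursion stacks up to n frames.
    """
    if i == len(arr):
        return 0
    return arr[i] + sum_elements_recursive(arr, i + 1)
```

The iterative version above does the same O(n) work in O(1) space, which is why converting recursion to iteration is a common space optimization.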

Best, Average, and Worst-Case Analysis

For interviews, it’s essential to understand how algorithms behave in different scenarios.

Example: Linear Search Best, Average, and Worst Cases

In the case of linear search:

- Best Case (O(1)): Target is the first element.

- Average Case (O(n)): Target is somewhere in the middle; on average about n/2 comparisons are made, but constants are dropped, so this is still O(n).

- Worst Case (O(n)): Target is the last element or missing.

Example: Worst Case in Sorting Algorithms

For sorting algorithms:

- Selection Sort has a worst-case time complexity of O(n^2).

- Merge Sort has a worst-case time complexity of O(n log n).
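As a sketch of where merge sort’s O(n log n) worst case comes from: the input is split into halves for about log n levels, and each level does O(n) work merging.

```python
def merge_sort(arr: list[int]) -> list[int]:
    """
    Worst-case Time Complexity: O(n log n) — log n levels of splitting,
    O(n) merge work per level.
    """
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves: O(n) for this level.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```

Unlike selection sort, merge sort’s worst case equals its best case, which is why it is a safe default when worst-case guarantees matter.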

Step-by-Step Analysis Process

1. Identify the Input Size

First, determine what “n” represents - usually the size of your input data structure (array length, string length, etc.).

2. Count Operations in Loops

  • Single loop: O(n)
  • Nested loops: Multiply complexities
  • Sequential operations: Add complexities
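These counting rules can be seen in one small sketch: two sequential loops add to O(n) + O(n) = O(n), while a nested loop multiplies to O(n) × O(n) = O(n²), which dominates.

```python
def sequential_then_nested(arr: list[int]) -> tuple[int, int]:
    """
    Two sequential loops: O(n) + O(n) = O(n).
    One nested loop: O(n) × O(n) = O(n²), which dominates overall.
    """
    total = 0
    for x in arr:        # O(n)
        total += x
    for x in arr:        # another O(n); complexities add, not multiply
        total += x
    pairs = 0
    for _x in arr:       # O(n)
        for _y in arr:   # × O(n) → O(n²)
            pairs += 1
    return total, pairs
```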

3. Focus on Dominant Terms

Drop constants and lower-order terms. O(3n² + 2n + 1) becomes O(n²).

How to Analyze Any Code

1. Identify Loops and Recursion

Look for for loops, while loops, and recursive function calls. These typically drive complexity.

2. Analyze Loop Relationships

  • Independent loops: Add complexities (O(n) + O(n) = O(n))
  • Nested loops: Multiply complexities (O(n) × O(n) = O(n²))
  • Dependent loops: Consider how inner loop bounds change
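Dependent loops are the subtle case. In the sketch below, the inner loop runs 0 + 1 + … + (n − 1) = n(n − 1)/2 times in total, and since n²/2 drops its constant, this is still O(n²).

```python
def dependent_loop_count(n: int) -> int:
    """
    The inner bound depends on the outer index.
    Total iterations: 0 + 1 + ... + (n - 1) = n(n - 1)/2 → O(n²).
    """
    count = 0
    for i in range(n):
        for j in range(i):  # runs i times on the i-th outer pass
            count += 1
    return count
```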

3. Handle Recursive Functions

Draw the recursion tree:

  • Depth: How many recursive calls deep?
  • Branching factor: How many recursive calls per level?
  • Work per call: What’s the non-recursive work?

Formula: O(branches^depth × work_per_call)
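Applying the formula to naive Fibonacci: 2 branches per call, a tree roughly n levels deep, and O(1) work per call gives O(2^n) time.

```python
def fib(n: int) -> int:
    """
    Branching factor 2, recursion depth ~n, O(1) work per call → O(2^n) time.
    Space Complexity: O(n), for the deepest chain of stack frames.
    """
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
```

Memoizing this function collapses the tree to O(n) time, which is a classic interview follow-up.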

4. Consider Built-in Operations

Python operations have hidden complexity:

  • list.append(): O(1) amortized
  • list.insert(0, x): O(n)
  • sorted(): O(n log n)
  • x in list: O(n)
  • x in set: O(1) average case
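The gap between x in list and x in set shows up whenever you do repeated membership checks, as in this sketch: scanning a list for each of m needles costs O(n × m), while building a set once brings it down to O(n + m) on average.

```python
def contains_all_list(haystack: list[int], needles: list[int]) -> bool:
    # Each "in" over a list is an O(n) scan, so this is O(n × m) overall.
    return all(x in haystack for x in needles)

def contains_all_set(haystack: list[int], needles: list[int]) -> bool:
    # Building the set is O(n); each lookup is O(1) average → O(n + m).
    lookup = set(haystack)
    return all(x in lookup for x in needles)
```

Both functions return the same answers; only their growth rates differ, which is exactly the distinction Big O captures.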

Design Pattern Consideration

When using design patterns, consider their complexity implications:

  • Strategy Pattern: Complexity depends on the chosen strategy
  • Observer Pattern: O(n) notification to n observers
  • Decorator Pattern: Adds constant overhead O(1)
  • Factory Pattern: Usually O(1) object creation

The key is to analyze the actual algorithm implementation, not just the pattern structure.
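As a minimal observer sketch (the class and method names here are illustrative, not from any particular library), notifying n subscribers is just a single O(n) loop, however the pattern is dressed up:

```python
from typing import Callable

class Subject:
    """Minimal observer pattern: notify() is O(n) in the number of observers."""

    def __init__(self) -> None:
        self._observers: list[Callable[[str], None]] = []

    def subscribe(self, observer: Callable[[str], None]) -> None:
        self._observers.append(observer)  # O(1) amortized

    def notify(self, event: str) -> None:
        for observer in self._observers:  # O(n) over n observers
            observer(event)
```

If each observer’s callback itself does O(k) work, a notification costs O(n × k) — another reminder to analyze the implementation, not the pattern’s name.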

Remember: Big O describes growth rate trends, not exact runtime. An O(n²) algorithm might outperform an O(n log n) algorithm for small inputs due to constants and implementation details.
