DSA Optimization Techniques

Below is a breakdown of essential optimization techniques used in arrays, graphs, dynamic programming, and advanced data structures.


🔹 1. Preprocessing Techniques

👉 Use extra space to reduce time complexity.

✅ Prefix Sum (O(1) query, O(n) preprocessing)

  • Problem: Given an array nums[], compute the sum of elements between indices [L, R] efficiently.
  • Optimization: Instead of recalculating the sum repeatedly, use a prefix sum array.
  • Example:
  n = len(nums)
  prefix = [0] * (n + 1)
  for i in range(n):
      prefix[i+1] = prefix[i] + nums[i]  # Precompute prefix sums
  # Sum of nums[L..R] is then prefix[R+1] - prefix[L]

✅ Suffix Sum (used for right-to-left computations, analogous to prefix sums).
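A suffix sum array can be built the same way by scanning from the right; a minimal sketch (function name is illustrative):

```python
def build_suffix(nums):
    n = len(nums)
    suffix = [0] * (n + 1)
    for i in range(n - 1, -1, -1):
        suffix[i] = suffix[i + 1] + nums[i]  # suffix[i] = sum of nums[i:]
    return suffix
```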

✅ Hashing (Precompute Frequencies/Occurrences)

  • Convert O(N) lookups into O(1) hash table lookups.
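For instance, the classic Two Sum problem drops from O(N²) to O(N) by caching previously seen values in a dict (a minimal sketch):

```python
def two_sum(nums, target):
    seen = {}  # value -> index of where it was seen
    for i, num in enumerate(nums):
        if target - num in seen:  # O(1) hash lookup instead of an O(N) scan
            return [seen[target - num], i]
        seen[num] = i
    return []
```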

✅ Sparse Tables (O(1) range queries, O(n log n) preprocessing).
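A sparse table works for idempotent operations such as min/max/gcd, where the two power-of-two blocks covering a range may overlap. One possible sketch for range-minimum queries (function names are illustrative):

```python
def build_sparse_table(arr):
    """Sparse table for range-minimum queries: O(n log n) build, O(1) query."""
    n = len(arr)
    table = [arr[:]]  # table[j][i] = min of arr[i : i + 2**j]
    j = 1
    while (1 << j) <= n:
        prev = table[j - 1]
        half = 1 << (j - 1)
        table.append([min(prev[i], prev[i + half])
                      for i in range(n - (1 << j) + 1)])
        j += 1
    return table

def query_min(table, left, right):
    """Minimum of arr[left..right] (inclusive) via two overlapping blocks."""
    j = (right - left + 1).bit_length() - 1
    return min(table[j][left], table[j][right - (1 << j) + 1])
```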


🔹 2. Two Pointers / Sliding Window

👉 Reduce nested loops by maintaining two pointers.

✅ Sliding Window

  • Used in subarray problems (e.g., max sum, longest substring).
  • Avoids unnecessary recomputation.

🔹 Example (Find longest substring with at most k distinct characters):

def longest_substr_k_distinct(s, k):
    count = {}
    left = 0
    max_len = 0

    for right in range(len(s)):
        count[s[right]] = count.get(s[right], 0) + 1

        while len(count) > k:
            count[s[left]] -= 1
            if count[s[left]] == 0:
                del count[s[left]]
            left += 1

        max_len = max(max_len, right - left + 1)

    return max_len

💡 Optimized from the O(N²) brute force to O(N) using a sliding window.

✅ Fast-Slow Pointers

  • Detect cycles in linked lists.
  • Find the middle of a list (O(N) time, O(1) space).
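Both patterns can be sketched as follows (ListNode is a minimal stand-in node class):

```python
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def find_middle(head):
    """Middle node of a linked list in one pass: O(N) time, O(1) space."""
    slow = fast = head
    while fast and fast.next:
        slow = slow.next       # moves 1 step
        fast = fast.next.next  # moves 2 steps
    return slow

def has_cycle(head):
    """Floyd's cycle detection: the pointers meet iff a cycle exists."""
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False
```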

🔹 3. Greedy Optimization

👉 Make locally optimal choices to reach a globally optimal solution.

✅ Activity Selection Problem (Sorting + Greedy)
✅ Huffman Encoding (Min-Heap Optimization)
✅ Dijkstra's Algorithm (Greedy + Priority Queue)
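Activity selection, for example, sorts intervals by finish time and greedily keeps every interval compatible with the last one chosen. A minimal sketch (function name is illustrative):

```python
def max_activities(intervals):
    """Greedy activity selection: O(N log N) for the sort, O(N) scan."""
    intervals.sort(key=lambda iv: iv[1])  # earliest finish time first
    count = 0
    last_finish = float('-inf')
    for start, finish in intervals:
        if start >= last_finish:  # compatible with the last chosen activity
            count += 1
            last_finish = finish
    return count
```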

Example: Jump Game II (Greedy O(N) vs DP O(N²))

def jump(nums):
    jumps = 0
    curr_end = 0
    farthest = 0

    for i in range(len(nums) - 1):
        farthest = max(farthest, i + nums[i])
        if i == curr_end:
            jumps += 1
            curr_end = farthest

    return jumps

💡 Optimized from O(N²) (DP) to O(N) using a greedy approach.


🔹 4. Dynamic Programming (Memoization & Tabulation)

👉 Reduce exponential time complexity by caching results.

✅ Memoization (Top-Down DP) – Recursive + Cache
✅ Tabulation (Bottom-Up DP) – Iterative

Example: Fibonacci Sequence (Optimized DP O(N) vs Recursion O(2^N))

def fib(n, memo=None):
    if memo is None:
        memo = {}  # Fresh cache per top-level call; a mutable default would be shared across calls
    if n <= 1:
        return n
    if n not in memo:
        memo[n] = fib(n-1, memo) + fib(n-2, memo)  # Store results
    return memo[n]

💡 Avoids recomputation using memoization.
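The same recurrence can be filled bottom-up (tabulation), which avoids recursion-depth limits; keeping only the last two values also drops space to O(1). A minimal sketch:

```python
def fib_tab(n):
    """Bottom-up Fibonacci: O(N) time, O(1) space via two rolling values."""
    if n <= 1:
        return n
    prev, curr = 0, 1
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr
```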


🔹 5. Binary Search Optimization

👉 Used in searching, finding lower/upper bounds, and optimization problems.

✅ Binary Search on Sorted Arrays
✅ Binary Search on Answer (Minimized Search Space)

  • Example: Find the smallest divisor such that the sum of divisions ≤ threshold.
def smallestDivisor(nums, threshold):
    left, right = 1, max(nums)

    while left < right:
        mid = (left + right) // 2
        if sum((num + mid - 1) // mid for num in nums) > threshold:
            left = mid + 1
        else:
            right = mid
    return left

💡 Optimized from an O(N·M) linear scan over all candidate divisors to O(N log M), where M = max(nums), by binary searching on the divisor.
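The lower-bound pattern used above is the same loop behind classic binary search on sorted arrays; Python's standard library exposes it as bisect.bisect_left. A minimal sketch:

```python
import bisect

def lower_bound(arr, target):
    """First index i with arr[i] >= target, or len(arr) if none: O(log N)."""
    left, right = 0, len(arr)
    while left < right:
        mid = (left + right) // 2
        if arr[mid] < target:
            left = mid + 1
        else:
            right = mid
    return left

# Equivalent to the standard-library bisect.bisect_left(arr, target)
```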


🔹 6. Graph Optimizations

👉 Avoid recomputation in traversal-heavy problems.

✅ Dijkstra's Algorithm (Optimized with Min-Heap) O(E log V)
✅ A* Algorithm (Heuristic-based optimization of Dijkstra)
✅ Union-Find with Path Compression (O(α(N)) ≈ O(1))
✅ Topological Sorting (Kahn's Algorithm O(V + E))
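Union-Find with path compression (plus union by size) can be sketched as:

```python
class UnionFind:
    """Disjoint-set with path compression and union by size: near O(1) per op."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False  # already connected
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra  # attach the smaller tree under the larger
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]
        return True
```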

Example: Dijkstra's Algorithm with a Min-Heap (a priority queue instead of BFS's plain FIFO queue)

import heapq

def dijkstra(graph, start):
    heap = [(0, start)]  # (distance, node)
    distances = {node: float('inf') for node in graph}
    distances[start] = 0

    while heap:
        curr_dist, node = heapq.heappop(heap)
        if curr_dist > distances[node]:
            continue  # Skip stale heap entries for already-settled nodes

        for neighbor, weight in graph[node]:
            new_dist = curr_dist + weight
            if new_dist < distances[neighbor]:
                distances[neighbor] = new_dist
                heapq.heappush(heap, (new_dist, neighbor))

    return distances

💡 Optimized from O(V²) to O(E log V) using a priority queue.


🔹 7. Bit Manipulation Optimization

👉 Optimize space and operations using bits.

✅ Check if a number is a power of 2 (n > 0 and (n & (n - 1)) == 0)
✅ Find single non-repeating element (XOR all numbers)
✅ Count number of set bits (Brian Kernighan's Algorithm)
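Brian Kernighan's trick relies on n & (n - 1) clearing the lowest set bit, so the loop runs once per set bit rather than once per bit position. A minimal sketch:

```python
def count_set_bits(n):
    """Count 1-bits in n; each iteration clears the lowest set bit."""
    count = 0
    while n:
        n &= n - 1
        count += 1
    return count
```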

Example: Find Single Non-Duplicate Number in O(N) Time & O(1) Space

def singleNumber(nums):
    res = 0
    for num in nums:
        res ^= num  # XOR cancels out duplicates
    return res

💡 Optimized from O(N) space (HashMap) to O(1) using XOR.


🔥 Summary

| Optimization | Technique | Time Complexity Gain |
| --- | --- | --- |
| Preprocessing | Prefix/Suffix Sum, Hashing | O(N) → O(1) per query |
| Sliding Window | Two Pointers | O(N²) → O(N) |
| Greedy | Activity Selection, Dijkstra | O(N²) → O(N log N) |
| Dynamic Programming | Memoization, Tabulation | O(2^N) → O(N) |
| Binary Search | Lower/Upper Bound | O(N) → O(log N) |
| Graph Algorithms | BFS, DFS, Dijkstra | O(V²) → O(E log V) |
| Bit Manipulation | XOR, Power of 2 | O(N) → O(1) |

🔹 Final Tip:

When solving DSA problems, always start with brute force, analyze its time complexity, and then apply one or more of these optimizations to improve performance.

