Recurrence

When analyzing the runtime of a recursive algorithm, you can write the total runtime in terms of the runtimes of the recursive calls, creating a recurrence relation.

E.g. the divide-and-conquer convex hull algorithm has 3 main steps:

Split points into left and right halves                 # O(n)
Recursively find the convex hulls of each half          # 2T(n/2)
Merge the 2 hulls into a single one by adding tangents  # O(n)

So in total:

\[\begin{split}T(n) & =O(n)+2T(\frac{n}{2})+O(n) \\ & =2T(\frac{n}{2})+O(n)\end{split}\]

Note: every recurrence needs a base case, which is problem-specific (for convex hull, \(n \leq 3\) points is the base case)
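The three steps and base case above can be sketched as a cost-counting skeleton (a sketch only, not a real convex-hull implementation; the unit costs are assumptions):

```python
# Cost-counting skeleton for the divide-and-conquer recurrence
# T(n) = 2T(n/2) + O(n); the split/merge costs of exactly n work
# units each are assumptions, not a real convex-hull implementation.
def cost(n):
    """Total work units the skeleton performs on an input of size n."""
    if n <= 3:              # base case: hull of <= 3 points is immediate
        return 1
    split = n               # O(n): partition the points into two halves
    left = cost(n // 2)     # T(n/2): hull of the left half
    right = cost(n // 2)    # T(n/2): hull of the right half
    merge = n               # O(n): join the two hulls with tangents
    return split + left + right + merge
```

For powers of two, `cost(n)` works out to roughly \(2n \log_2 n\), previewing the \(\Theta(n \log n)\) solution derived below.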

Solving

Recursion Tree

e.g. \(T(n)=2T(\frac{n}{2})+O(n)\)

Solving for the depth \(d\) at which subproblems reach the base case: \(\frac{n}{2^d}\leq 3 \to 2^d \geq \frac{n}{3} \to d \geq \log_2(\frac{n}{3}) = \log_2(n) - \log_2(3) = \Theta(\log n)\)

[figure _images/recurrence1.png: recursion tree for the recurrence]

In conclusion, the work at each level is \(\Theta(n)\), with \(\Theta(\log n)\) levels, so the total runtime is \(\Theta(n \log n)\).

The recursion tree is used informally to guess the asymptotic solution of a recurrence, but it does not prove the solution works.
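Even so, the level-by-level accounting can be sanity-checked numerically (taking the \(O(n)\) term to be exactly \(n\), an assumption for the sketch):

```python
import math

# Numeric check of the recursion-tree accounting for T(n) = 2T(n/2) + n:
# at depth d there are 2^d subproblems, each doing n/2^d work.
def level_work(n, depth):
    """Work summed across all nodes at one depth of the recursion tree."""
    return (2 ** depth) * (n / 2 ** depth)

n = 1024
levels = int(math.log2(n))                      # Theta(log n) levels
works = [level_work(n, d) for d in range(levels)]
assert all(w == n for w in works)               # each level does Theta(n) work
total = sum(works)                              # so the total is Theta(n log n)
```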

Induction/Substitution

Solve for a guess using some other method (e.g. a recursion tree), then use substitution to prove it.

e.g. \(T(n)=2T(n/2)+O(n)\)

  • guess that \(T(n)=O(n \log n)\)

  • need to prove \(\exists n_0, c > 0\) s.t. \(\forall n \geq n_0, T(n) \leq c n \log n\).

Fix \(n_0\) and \(c\), and prove \(T(n)\leq cn \log n\) for \(n \geq n_0\) by induction on \(n\)

  • Base case (\(n = n_0\)): Need \(T(n_0) \leq cn_0 \log n_0\)
    • Take \(n_0 > 1\) (so \(\log n_0 > 0\)), then pick \(c\) large enough to satisfy the inequality (strictly, every \(n\) with \(n_0 \leq n < 2n_0\) is a base case, since the inductive step only applies once \(n/2 \geq n_0\))

  • Inductive case: Assume \(T(m) \leq cm \log m\) for all \(m\) with \(n_0 \leq m < n\)
    • then \(T(n) = 2T(n/2)+O(n)\)

    • since \(n/2 < n\), \(T(n/2)\leq c \frac{n}{2} \log \frac{n}{2}\) (by IH)

    • \(T(n) \leq \frac{2cn}{2} \log \frac{n}{2} + dn\) (for some constant d)

    • \(T(n) \leq cn (\log \frac{n}{2}) + dn\)

    • \(= cn (\log n - \log 2) + dn\)

    • \(= cn \log n + n (d -c \log 2)\)

    • \(\leq cn \log n\) if \(c \geq \frac{d}{\log 2}\)

    • This establishes the claim at \(n\), completing the inductive step.

  • By induction, \(T(n) \leq cn \log n\) for all \(n \geq n_0\).

To prove \(T(n) = \Theta(n \log n)\), you would also need \(T(n) = \Omega(n \log n)\); that proof looks similar to the one above, but with the \(\leq\) and \(\geq\) reversed.
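The constants from the proof can be sanity-checked on a concrete instance (the base cost of 1 and the constant \(d = 1\) on the linear term are assumptions for the sketch):

```python
import math
from functools import lru_cache

# Concrete instance of T(n) = 2T(n/2) + n with T(n) = 1 for n <= 3;
# the base cost and the constant on the linear term are assumptions.
@lru_cache(maxsize=None)
def T(n):
    if n <= 3:
        return 1
    return 2 * T(n // 2) + n

# With logs base 2, log 2 = 1 and d = 1, so c >= d suffices; take c = 2
# and n_0 = 4 and check the bound T(n) <= c n log n on a range of n.
c, n0 = 2, 4
assert all(T(n) <= c * n * math.log2(n) for n in range(n0, 5000))
```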

Master Theorem

For recurrences of the form \(T(n) = aT(n/b) + f(n)\), we can use the master theorem for divide-and-conquer recurrences (not proved in class; a proof via recursion trees is in CLRS).

Thm: If \(T(n)\) satisfies a recurrence \(T(n) = aT(n/b) + f(n)\) where \(a \geq 1, b > 1\), or the same recurrence with either \(T(\lfloor n/b \rfloor)\) or \(T(\lceil n/b \rceil)\), then defining \(c = \log_b a\), we have:

  1. If \(f(n) = O(n^{c-\epsilon})\) for some \(\epsilon > 0\), \(T(n) = \Theta(n^c)\)

  2. If \(f(n) = \Theta(n^c)\), \(T(n) = \Theta(n^c \log n)\)

  3. If \(f(n) = \Omega(n^{c+\epsilon})\) for some \(\epsilon > 0\) and \(af(n/b) \leq df(n)\) for some \(d < 1\) and sufficiently large n, \(T(n) = \Theta(f(n))\)
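For driving functions of the form \(f(n) = \Theta(n^k)\), the case analysis can be packaged into a small helper (a hypothetical function, not from the notes; it covers polynomial \(f\) only):

```python
import math

# Classify T(n) = aT(n/b) + Theta(n^k) (a >= 1, b > 1, polynomial f only)
# into the three master-theorem cases.
def master(a, b, k):
    c = math.log(a, b)                    # critical exponent c = log_b(a)
    if math.isclose(k, c):
        return f"Theta(n^{k:g} log n)"    # case 2: every level does equal work
    if k < c:
        return f"Theta(n^{c:g})"          # case 1: the leaves dominate
    # case 3: the root dominates; for f(n) = n^k with k > c the regularity
    # condition a*f(n/b) <= d*f(n) holds automatically with d = a / b^k < 1
    return f"Theta(n^{k:g})"
```

For example, `master(9, 3, 1)` returns `"Theta(n^2)"`, matching the first example below.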

Examples

  • \(T(n) = 9T(n/3)+n\)
    • \(c = \log_3 9 = 2\)

    • Case 1: \(n = O(n^{2-\epsilon})\)

    • so \(T(n) = \Theta(n^2)\)

  • \(T(n) = T(2n/3)+1\)
    • \(c = \log_{3/2} 1 = 0\)

    • Case 2: \(1 = \Theta(n^0) = \Theta(1)\)

    • so \(T(n) = \Theta(\log n)\)

  • \(T(n) = 3T(n/4)+n^2\)
    • \(c = \log_4 3 < 1\)

    • Case 3: \(n^2 = \Omega(n^{c+\epsilon})\) with \(\epsilon = 1\), since \(c + 1 < 2\)

    • check the regularity condition: is \(af(n/b) = 3(n/4)^2 \leq dn^2\) for some \(d < 1\)?
      • \(\frac{3}{16}n^2 \leq dn^2\) for \(d=\frac{3}{16}<1\)

    • so \(T(n)=\Theta(n^2)\)

  • \(T(n)=4T(n/2)+3n^2\)
    • \(c=\log_2 4 = 2\)

    • Case 2: \(3n^2=\Theta(n^2)\)

    • so \(T(n)=\Theta(n^2 \log n)\)
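The last example can also be checked numerically: with an assumed base case \(T(1) = 1\), the ratio \(T(n) / (n^2 \log_2 n)\) should settle toward a constant, consistent with case 2:

```python
from functools import lru_cache

# Concrete instance of T(n) = 4T(n/2) + 3n^2; the base case T(1) = 1
# is an assumption for the sketch.
@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1
    return 4 * T(n // 2) + 3 * n * n

# For n = 2^i the ratio T(n) / (n^2 log2 n) approaches the constant 3,
# as predicted by the Theta(n^2 log n) bound.
ratios = [T(2 ** i) / ((2 ** i) ** 2 * i) for i in range(4, 14)]
```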

There is a generalization of the master theorem to recurrences with differently-sized subproblems: the Akra-Bazzi method (not covered in this class)