Next: Lecture 4 - heapsort Up: No Title Previous: Lecture 2 - asymptotic

# Lecture 3 - recurrence relations

Problem 2.1-2: Show that for any real constants $a$ and $b$, $b > 0$, $(n+a)^b = \Theta(n^b)$.

To show $(n+a)^b = \Theta(n^b)$, we must show both $O$ and $\Omega$. Go back to the definition!

• Big $O$ - Must show that $(n+a)^b \le c_1 \cdot n^b$ for all $n \ge n_0$. When is this true? If $c_1 = 2^b$, this is true for all $n > |a|$, since $n + a < 2n$; raise both sides to the $b$.
• Big $\Omega$ - Must show that $(n+a)^b \ge c_2 \cdot n^b$ for all $n \ge n_0$. When is this true? If $c_2 = (1/2)^b$, this is true for all $n > 2|a|$, since $n + a > n/2$; raise both sides to the $b$.

Note the need for absolute values.

Problem 2.1-4:

(a) Is $2^{n+1} = O(2^n)$?

(b) Is $2^{2n} = O(2^n)$?

(a) Is $2^{n+1} = O(2^n)$?

Is $2^{n+1} \le c \cdot 2^n$?

Yes, since $2^{n+1} = 2 \cdot 2^n \le c \cdot 2^n$ for all $n$ if $c \ge 2$.

(b) Is $2^{2n} = O(2^n)$?

Is $2^{2n} \le c \cdot 2^n$?

Note that $2^{2n} = 2^n \cdot 2^n$.

Is $2^n \cdot 2^n \le c \cdot 2^n$?

Is $2^n \le c$?

No! Certainly for any constant $c$ we can find an $n$ such that $2^n > c$, so this is not true.

Recurrence Relations

Many algorithms, particularly divide and conquer algorithms, have time complexities which are naturally modeled by recurrence relations.

A recurrence relation is an equation which is defined in terms of itself.

Why are recurrences good things?

1. Many natural functions are easily expressed as recurrences:

$a_n = a_{n-1} + 1,\; a_1 = 1 \Longrightarrow a_n = n$ (polynomial)

$a_n = 2 a_{n-1},\; a_1 = 1 \Longrightarrow a_n = 2^{n-1}$ (exponential)

$a_n = n \cdot a_{n-1},\; a_1 = 1 \Longrightarrow a_n = n!$ (weird function)

2. It is often easy to find a recurrence as the solution of a counting problem. Solving the recurrence can be done for many special cases as we will see, although it is somewhat of an art.

Recursion is Mathematical Induction!

In both, we have general and boundary conditions, with the general condition breaking the problem into smaller and smaller pieces.

The initial or boundary conditions terminate the recursion.

As we will see, induction provides a useful tool to solve recurrences - guess a solution and prove it by induction.

Consider the recurrence $T(n) = 2T(n-1) + 1$, $T(0) = 0$:

| n    | 0 | 1 | 2 | 3 | 4  | 5  | 6  | 7   |
|------|---|---|---|---|----|----|----|-----|
| T(n) | 0 | 1 | 3 | 7 | 15 | 31 | 63 | 127 |

Guess what the solution is? $T(n) = 2^n - 1$.

Prove by induction:

1. Show that the basis is true: $T(0) = 2^0 - 1 = 0$.
2. Now assume it is true for $T(n-1)$.
3. Using this assumption, show: $T(n) = 2T(n-1) + 1 = 2(2^{n-1} - 1) + 1 = 2^n - 1$.
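The guess is easy to check numerically before proving it; a minimal sketch, assuming the recurrence $T(n) = 2T(n-1) + 1$ with boundary condition $T(0) = 0$ behind the table above:

```python
# Check the guessed closed form T(n) = 2^n - 1 against the recurrence
# T(n) = 2*T(n-1) + 1, T(0) = 0 (assumed from the table above).

def T(n):
    value = 0                  # boundary condition T(0) = 0
    for _ in range(n):
        value = 2 * value + 1  # general condition
    return value

for n in range(8):
    assert T(n) == 2**n - 1
print([T(n) for n in range(8)])  # [0, 1, 3, 7, 15, 31, 63, 127]
```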

Solving Recurrences

No general procedure for solving recurrence relations is known, which is why it is an art. My approach is:

Realize that linear, finite-history, constant-coefficient recurrences can always be solved

Check out any combinatorics or differential equations book for a procedure.

Consider $a_n = 2 a_{n-1} + a_{n-2}$, $a_0 = 0$, $a_1 = 1$.

It has history = 2, degree = 1, and coefficients of 2 and 1. Thus it can be solved mechanically! Proceed:

• Find the characteristic equation, e.g. $\alpha^2 - 2\alpha - 1 = 0$ for the recurrence above.

• Solve to get roots, which appear in the exponents.
• Take care of repeated roots and inhomogeneous parts.
• Find the constants to finish the job.

Systems like Mathematica and Maple have packages for doing this.
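The mechanical procedure can also be sketched directly; this sketch assumes the recurrence $a_n = 2a_{n-1} + a_{n-2}$ with illustrative boundary values $a_0 = 0$, $a_1 = 1$:

```python
import math

# Mechanically solve a_n = 2*a_{n-1} + a_{n-2}, a_0 = 0, a_1 = 1 (assumed values).
# Characteristic equation: x^2 - 2x - 1 = 0, whose roots go in the exponents.

r1 = 1 + math.sqrt(2)   # roots of the characteristic equation
r2 = 1 - math.sqrt(2)

# Find the constants A, B from the boundary conditions:
#   A + B = a_0 = 0  and  A*r1 + B*r2 = a_1 = 1
A = 1 / (r1 - r2)
B = -A

def closed_form(n):
    return A * r1**n + B * r2**n

# Compare against iterating the recurrence directly.
a = [0, 1]
for n in range(2, 10):
    a.append(2 * a[n - 1] + a[n - 2])

for n in range(10):
    assert round(closed_form(n)) == a[n]
print(a)  # [0, 1, 2, 5, 12, 29, 70, 169, 408, 985]
```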

Guess a solution and prove by induction

To guess the solution, play around with small values for insight.

Note that you can do inductive proofs with big-O notation - just be sure you use it correctly.

Example: $T(n) = 2 T(\lfloor n/2 \rfloor) + n$.

Show that $T(n) \le c n \lg n$ for large enough $c$ and $n$. Assume that it is true for $\lfloor n/2 \rfloor$, then

$T(n) \le 2 (c \lfloor n/2 \rfloor \lg \lfloor n/2 \rfloor) + n \le c n \lg(n/2) + n = c n \lg n - c n + n \le c n \lg n$

for any $c \ge 1$. Starting with basis cases $T(2) = 4$, $T(3) = 5$ lets us complete the proof for $c \ge 2$ and $n \ge 2$.
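The bound can be spot-checked numerically; a minimal sketch, taking $c = 2$ as the (assumed) witness constant:

```python
import math
from functools import lru_cache

# Spot-check T(n) = 2*T(floor(n/2)) + n against the bound c*n*lg(n),
# using basis cases T(2) = 4 and T(3) = 5 and the assumed witness c = 2.

@lru_cache(maxsize=None)
def T(n):
    if n == 2:
        return 4
    if n == 3:
        return 5
    return 2 * T(n // 2) + n

c = 2
for n in range(2, 5000):
    assert T(n) <= c * n * math.log2(n)
print(T(1024))  # 11264, safely below 2 * 1024 * lg(1024) = 20480
```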

Try backsubstituting until you know what is going on

Also known as the iteration method. Plug the recurrence back into itself until you see a pattern.

Example: $T(n) = 3 T(\lfloor n/4 \rfloor) + n$.

Try backsubstituting:

$T(n) = n + 3 T(\lfloor n/4 \rfloor) = n + 3 (\lfloor n/4 \rfloor + 3 T(\lfloor n/16 \rfloor)) = n + 3 \lfloor n/4 \rfloor + 9 \lfloor n/16 \rfloor + 27 T(\lfloor n/64 \rfloor)$

The $(3/4)^i n$ term should now be obvious.

Although there are only $\lceil \log_4 n \rceil$ terms before we get to $T(1)$, it doesn't hurt to sum them all to infinity, since this is a geometrically decreasing series:

$T(n) \le n \sum_{i=0}^{\infty} (3/4)^i = 4n = O(n)$
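The geometric-series bound $T(n) \le 4n$ is easy to confirm by brute force; a sketch, assuming the boundary condition $T(1) = 1$:

```python
from functools import lru_cache

# Check the geometric-series bound T(n) <= 4n for the recurrence
# T(n) = 3*T(floor(n/4)) + n, with boundary T(1) = 1 assumed.

@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1
    return 3 * T(n // 4) + n

for n in range(1, 20000):
    assert T(n) <= 4 * n
print(T(256))  # 781, below the bound 4 * 256 = 1024
```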

Recursion Trees

Drawing a picture of the backsubstitution process gives you an idea of what is going on.

We must keep track of two things - (1) the size of the remaining argument to the recurrence, and (2) the additive stuff to be accumulated during this call.

Example: $T(n) = 3 T(\lfloor n/4 \rfloor) + n$ again.

The remaining arguments are on the left, the additive terms on the right.

Although this tree has height $\log_4 n$, the total sum at each level decreases geometrically, so:

$T(n) \le n \sum_{i=0}^{\infty} (3/4)^i = 4n = O(n)$

The recursion tree framework made this much easier to see than with algebraic backsubstitution.

See if you can use the Master theorem to provide an instant asymptotic solution

The Master Theorem: Let $a \ge 1$ and $b > 1$ be constants, let $f(n)$ be a function, and let $T(n)$ be defined on the nonnegative integers by the recurrence

$T(n) = a T(n/b) + f(n)$

where we interpret $n/b$ as $\lfloor n/b \rfloor$ or $\lceil n/b \rceil$. Then $T(n)$ can be bounded asymptotically as follows:

1. If $f(n) = O(n^{\log_b a - \epsilon})$ for some constant $\epsilon > 0$, then $T(n) = \Theta(n^{\log_b a})$.
2. If $f(n) = \Theta(n^{\log_b a})$, then $T(n) = \Theta(n^{\log_b a} \lg n)$.
3. If $f(n) = \Omega(n^{\log_b a + \epsilon})$ for some constant $\epsilon > 0$, and if $a f(n/b) \le c f(n)$ for some constant $c < 1$ and all sufficiently large $n$, then $T(n) = \Theta(f(n))$.
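When the driving function is a simple polynomial $f(n) = n^k$, the three cases reduce to comparing $k$ with $\log_b a$; a hypothetical helper sketching this (the function name and string output are ours):

```python
import math

# Sketch of the Master Theorem for T(n) = a*T(n/b) + n^k,
# polynomial driving functions only (a hypothetical helper).

def master(a, b, k):
    crit = math.log(a, b)            # critical exponent log_b(a)
    if abs(k - crit) < 1e-9:
        return f"Theta(n^{k} * lg n)"   # case 2: f matches n^log_b(a)
    if k < crit:
        return f"Theta(n^{crit:g})"     # case 1: leaves dominate
    # case 3: k > crit; regularity a*(n/b)^k <= c*n^k holds with c = a/b^k < 1
    return f"Theta(n^{k})"

print(master(4, 2, 1))  # Theta(n^2)
print(master(4, 2, 2))  # Theta(n^2 * lg n)
print(master(4, 2, 3))  # Theta(n^3)
```

These three calls are exactly the worked examples that follow.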

Examples of the Master Theorem

Which case of the Master Theorem applies?

• T(n) = 4 T(n/2) + n

Reading from the equation, a=4, b=2, and f(n) = n.

Is $n = O(n^{\log_2 4 - \epsilon}) = O(n^{2 - \epsilon})$?

Yes, for any $\epsilon \le 1$, so case 1 applies and $T(n) = \Theta(n^2)$.

• $T(n) = 4 T(n/2) + n^2$

Reading from the equation, $a=4$, $b=2$, and $f(n) = n^2$.

Is $n^2 = O(n^{2 - \epsilon})$?

No, it is false if $\epsilon > 0$, but it is true that $n^2 = \Theta(n^2)$, so case 2 applies and $T(n) = \Theta(n^2 \lg n)$.

• $T(n) = 4 T(n/2) + n^3$

Reading from the equation, $a=4$, $b=2$, and $f(n) = n^3$.

Is $n^3 = \Omega(n^{2 + \epsilon})$?

Yes, for any $\epsilon \le 1$, so case 3 might apply.

Is $4 (n/2)^3 = n^3/2 \le c \cdot n^3$?

Yes, for $c = 1/2$, so there exists a $c < 1$ to satisfy the regularity condition, so case 3 applies and $T(n) = \Theta(n^3)$.

Why should the Master Theorem be true?

Consider T(n) = a T(n/b) + f(n).

Suppose f(n) is small enough

Say $f(n) = 0$, i.e. $T(n) = a T(n/b)$.

Then we have a recursion tree where the only contribution is at the leaves.

There will be $\log_b n$ levels, with $a^l$ nodes at level $l$.

The number of leaves is thus $a^{\log_b n} = n^{\log_b a}$, so $T(n) = \Theta(n^{\log_b a})$; so long as $f(n)$ is small enough that it is dwarfed by this, we have case 1 of the Master Theorem!
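The leaf count $a^{\log_b n} = n^{\log_b a}$ can be confirmed by brute force; a small sketch (the helper name is ours):

```python
import math

# Count the leaves of the recursion tree for T(n) = a*T(n/b):
# there are a^(log_b n) = n^(log_b a) of them (exact when n is a power of b).

def leaves(n, a, b):
    if n <= 1:
        return 1          # a leaf of the recursion tree
    return a * leaves(n // b, a, b)

a, b, n = 4, 2, 1024
assert leaves(n, a, b) == n ** round(math.log(a, b))
print(leaves(n, a, b))  # 4^10 = 1048576 = 1024^2
```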

Suppose f(n) is large enough

Now draw the recursion tree for $T(n) = a T(n/b) + f(n)$.

If f(n) is a big enough function, the one top call can be bigger than the sum of all the little calls.

Example: for $T(n) = 2 T(n/2) + n^3$, the top call contributes $n^3$, while all the recursive calls together contribute only $n^3 \sum_{i \ge 1} (2/2^3)^i = n^3/3$. In fact this holds unless $f(n) = O(n^{\log_b a})$!

In case 3 of the Master Theorem, the additive term dominates.

In case 2, both parts contribute equally, which is why the log pops up. It is (usually) what we want to have happen in a divide and conquer algorithm.

Famous Algorithms and their Recurrence

Matrix Multiplication

The standard matrix multiplication algorithm for two $n \times n$ matrices is $O(n^3)$.

Strassen discovered a divide-and-conquer algorithm with seven half-size multiplications, which takes $T(n) = 7 T(n/2) + O(n^2)$ time.

Since $n^{\log_2 7} \approx n^{2.81}$ dwarfs $n^2$, case 1 of the master theorem applies and $T(n) = \Theta(n^{\log_2 7})$.

This has been ``improved'' by more and more complicated recurrences until the current best is $O(n^{2.376})$.

Polygon Triangulation

Given a polygon in the plane, add diagonals so that each face is a triangle. None of the diagonals are allowed to cross.

Triangulation is an important first step in many geometric algorithms.

The simplest algorithm might be to try each pair of points and check if they see each other. If so, add the diagonal and recur on both halves, for a total of $O(n^3)$.

However, Chazelle gave an algorithm which runs in $T(n) = 2 T(n/2) + O(\sqrt{n})$ time. Since $\sqrt{n} = O(n^{1 - \epsilon})$ for $\epsilon = 1/2$, by case 1 of the Master Theorem, Chazelle's algorithm is linear, i.e. $T(n) = O(n)$.

Sorting

The classic divide and conquer recurrence is Mergesort's T(n) = 2 T(n/2) + O(n), which divides the data into equal-sized halves and spends linear time merging the halves after they are sorted.

Since $n = \Theta(n^{\log_2 2})$, but not $n = O(n^{1 - \epsilon})$, Case 2 of the Master Theorem applies and $T(n) = \Theta(n \lg n)$.

In case 2, the divide and merge steps balance out perfectly, as we usually hope for from a divide-and-conquer algorithm.
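A minimal mergesort sketch showing where the two half-size calls and the linear-time merge in $T(n) = 2T(n/2) + O(n)$ come from:

```python
# Mergesort: divide into halves, sort each recursively, merge in linear time.

def mergesort(items):
    if len(items) <= 1:               # trivial subproblem: solve directly
        return items
    mid = len(items) // 2
    left = mergesort(items[:mid])     # two recursive calls on halves: 2*T(n/2)
    right = mergesort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # linear-time merge: O(n)
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(mergesort([5, 2, 8, 1, 9, 3]))  # [1, 2, 3, 5, 8, 9]
```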


Approaches to Algorithms Design

Incremental

Job is partly done - do a little more, repeat until done.

A good example of this approach is insertion sort.
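A minimal insertion sort sketch illustrating the incremental pattern: the first $i$ items stay sorted, and each step does a little more by inserting the next item.

```python
# Insertion sort: the prefix a[0..i-1] is always sorted; each iteration
# inserts a[i] into its place within that prefix.

def insertion_sort(items):
    a = list(items)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:   # shift larger items one slot right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key                 # drop key into its place
    return a

print(insertion_sort([5, 2, 8, 1, 9, 3]))  # [1, 2, 3, 5, 8, 9]
```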

Divide-and-Conquer

A recursive technique

• Divide the problem into sub-problems of the same kind. (divide)
• For subproblems that are really small (trivial), solve them directly; else solve them recursively. (conquer)
• Combine subproblem solutions to solve the whole thing. (combine)

A good example of this approach is Mergesort.


Algorithms
Mon Jun 2 09:21:39 EDT 1997